from __future__ import absolute_import, division, unicode_literals

from future.builtins import str
from future.backports import urllib
from future.backports.urllib import parse as _parse, request as _request
urllib.parse = _parse
urllib.request = _request

__all__ = ["RobotFileParser"]


class RobotFileParser(object):
    """ This class provides a set of methods to read, parse and answer
    questions about a single robots.txt file.

    """

    def __init__(self, url=''):
        self.entries = []
        self.default_entry = None
        self.disallow_all = False
        self.allow_all = False
        self.set_url(url)
        self.last_checked = 0

    def mtime(self):
        """Returns the time the robots.txt file was last fetched.

        This is useful for long-running web spiders that need to
        check for new robots.txt files periodically.

        """
        return self.last_checked

    def modified(self):
        """Sets the time the robots.txt file was last fetched to the
        current time.

        """
        import time
        self.last_checked = time.time()

    def set_url(self, url):
        """Sets the URL referring to a robots.txt file."""
        self.url = url
        self.host, self.path = urllib.parse.urlparse(url)[1:3]

    def read(self):
        """Reads the robots.txt URL and feeds it to the parser."""
        try:
            f = urllib.request.urlopen(self.url)
        except urllib.error.HTTPError as err:
            if err.code in (401, 403):
                self.disallow_all = True
            elif err.code >= 400:
                self.allow_all = True
        else:
            raw = f.read()
            self.parse(raw.decode("utf-8").splitlines())

    def _add_entry(self, entry):
        if "*" in entry.useragents:
            # the default entry is considered last
            if self.default_entry is None:
                # the first default entry wins
                self.default_entry = entry
        else:
            self.entries.append(entry)

    def parse(self, lines):
        """Parse the input lines from a robots.txt file.

        We allow that a user-agent: line is not preceded by
        one or more blank lines.
        """
        # states:
        #   0: start state
        #   1: saw user-agent line
        #   2: saw an allow or disallow line
        state = 0
        entry = Entry()

        for line in lines:
            if not line:
                if state == 1:
                    entry = Entry()
                    state = 0
                elif state == 2:
                    self._add_entry(entry)
                    entry = Entry()
                    state = 0
            # remove optional comment and strip line
            i = line.find('#')
            if i >= 0:
                line = line[:i]
            line = line.strip()
            if not line:
                continue
            line = line.split(':', 1)
            if len(line) == 2:
                line[0] = line[0].strip().lower()
                line[1] = urllib.parse.unquote(line[1].strip())
                if line[0] == "user-agent":
                    if state == 2:
                        self._add_entry(entry)
                        entry = Entry()
                    entry.useragents.append(line[1])
                    state = 1
                elif line[0] == "disallow":
                    if state != 0:
                        entry.rulelines.append(RuleLine(line[1], False))
                        state = 2
                elif line[0] == "allow":
                    if state != 0:
                        entry.rulelines.append(RuleLine(line[1], True))
                        state = 2
        if state == 2:
            self._add_entry(entry)

    def can_fetch(self, useragent, url):
        """using the parsed robots.txt decide if useragent can fetch url"""
        if self.disallow_all:
            return False
        if self.allow_all:
            return True
        # search for given user agent matches; the first match counts
        parsed_url = urllib.parse.urlparse(urllib.parse.unquote(url))
        url = urllib.parse.urlunparse(('', '', parsed_url.path,
            parsed_url.params, parsed_url.query, parsed_url.fragment))
        url = urllib.parse.quote(url)
        if not url:
            url = "/"
        for entry in self.entries:
            if entry.applies_to(useragent):
                return entry.allowance(url)
        # try the default entry last
        if self.default_entry:
            return self.default_entry.allowance(url)
        # agent not found ==> access granted
        return True

    def __str__(self):
        return ''.join([str(entry) + "\n" for entry in self.entries])


class RuleLine(object):
    """A rule line is a single "Allow:" (allowance==True) or "Disallow:"
       (allowance==False) followed by a path."""

    def __init__(self, path, allowance):
        if path == '' and not allowance:
            # an empty value means allow all
            allowance = True
        self.path = urllib.parse.quote(path)
        self.allowance = allowance

    def applies_to(self, filename):
        return self.path == "*" or filename.startswith(self.path)

    def __str__(self):
        return (self.allowance and "Allow" or "Disallow") + ": " + self.path


class Entry(object):
    """An entry has one or more user-agents and zero or more rulelines"""

    def __init__(self):
        self.useragents = []
        self.rulelines = []

    def __str__(self):
        ret = []
        for agent in self.useragents:
            ret.extend(["User-agent: ", agent, "\n"])
        for line in self.rulelines:
            ret.extend([str(line), "\n"])
        return ''.join(ret)

    def applies_to(self, useragent):
        """check if this entry applies to the specified agent"""
        # split the name token and make it lower case
        useragent = useragent.split("/")[0].lower()
        for agent in self.useragents:
            if agent == '*':
                # we have the catch-all agent
                return True
            agent = agent.lower()
            if agent in useragent:
                return True
        return False

    def allowance(self, filename):
        """Preconditions:
        - our agent applies to this entry
        - filename is URL decoded"""
        for line in self.rulelines:
            if line.applies_to(filename):
                return line.allowance
        return True
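
# A minimal usage sketch follows (illustrative only: the robots.txt body and
# the agent names "FigTree" and "SomeBot" below are made-up examples, not
# fixtures shipped with this module). parse() is fed lines directly, so no
# network access is needed; read() would fetch self.url and do the same.
if __name__ == "__main__":
    rp = RobotFileParser()
    rp.parse("""\
User-agent: FigTree
Disallow: /tmp
Allow: /tmp/public

User-agent: *
Disallow: /private
""".splitlines())

    # Rule lines are checked in order, so the first matching path prefix
    # wins: "Disallow: /tmp" is consulted before "Allow: /tmp/public".
    print(rp.can_fetch("FigTree/1.0", "/tmp/public"))  # False
    # The "*" record becomes default_entry and is consulted only when no
    # named entry matches the agent.
    print(rp.can_fetch("SomeBot", "/private/index"))   # False
    # No rule matches ==> access granted.
    print(rp.can_fetch("SomeBot", "/index.html"))      # True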
