[Binary artifact: this file is compiled CPython bytecode (a cached `.pyc`) for botocore's `utils` module, extracted from a pip-unpacked wheel (`botocore/utils.py`). The marshalled code objects are not editable as text; only embedded string constants survive. Recoverable information from those constants: the module defines the EC2 metadata endpoints `http://169.254.169.254/` (IPv4) and `http://[fd00:ec2::254]/` (IPv6), service-name rename tables, and public helpers and classes including `ensure_boolean`, `resolve_imds_endpoint_mode`, `is_json_value_header`, `get_service_module_name`, `normalize_url_path`, `remove_dot_segments`, `set_value_from_jmespath`, `parse_timestamp`, `parse_to_aware_datetime`, `datetime2timestamp`, `calculate_sha256`, `calculate_tree_hash`, `CachedProperty`, `ArgumentGenerator`, `IMDSFetcher`, `InstanceMetadataFetcher`, `S3RegionRedirector`, `S3ArnParamHandler`, `S3EndpointSetter`, `S3ControlEndpointSetter`, `S3ControlArnParamHandler`, and `ContainerMetadataFetcher`.]
:param default: default encoding if the content-type is text z content-typeNÚcharsetz'"r°)rNÚcgiÚ parse_headerrë)r¢ÚdefaultÚ content_typer¢r5r5r9Úget_encoding_from_headersQ s rBcKs0t|ttfƒrt|ƒ}nt|ƒ}t |¡ d¡S)Nr)r?ÚbytesÚ bytearrayÚ_calculate_md5_from_bytesÚ_calculate_md5_from_fileÚbase64Ú b64encoder")rr²Z binary_md5r5r5r9Ú calculate_md5f s rIcCst|ƒ}| ¡S)N)r r)Z body_bytesÚmd5r5r5r9rEn srEcsFˆ ¡}tƒ}x$t‡fdd„dƒD]}| |¡q"Wˆ |¡| ¡S)Ncs ˆ d¡S)Ni)rár5)Úfileobjr5r9rv rz*_calculate_md5_from_file..r)Útellr rrÚseekr)rKZstart_positionrJrr5)rKr9rFs s  rFcKs@|d}|d}tr<|dk rÚwarningsZdateutil.parserrÿZ dateutil.tzrrZbotocore.awsrequestZbotocore.httpsessionZbotocore.compatrrrrrrr r r r r rZ*botocore.vendored.six.moves.urllib.requestrrZbotocore.exceptionsrrrrrrrrrrrrrrrr r!r"r#r$r%r&Zurllib3.exceptionsr'Ú getLoggerr{ršrÃrœrrFZ SAFE_CHARSrMrVr±rrZ EVENT_ALIASESÚIPV4_PATÚHEX_PATrÔÚLS32_PATr6Ú _variationsÚUNRESERVED_PATrcÚIPV6_PATÚ ZONE_ID_PATÚIPV6_ADDRZ_PATrGÚ frozensetrDrCrHrQrXr\r]rZrlroryÚ ExceptionrzrÚobjectrƒrÄrÛràrRrèrârõrðrrrrrr%rr'r.rJrQrŸrSrZr^r[r`rqrwr{rvr|r~rƒr„rÖr£r¤r¯rÃrûrr"r“r9rBrIrErFrNrOrSr5r5r5r9Ú s~ 8`        !&V   $ -  !`   >   !WB{d
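# The charset-extraction and streaming-MD5 helpers above can be exercised in
# isolation. The sketch below mirrors their observable behavior using only the
# standard library: `encoding_from_headers` and `streaming_md5_b64` are
# illustrative names (not botocore's), `email.message.Message` stands in for
# the deprecated `cgi.parse_header`, and `hashlib.md5` stands in for
# botocore's `get_md5` wrapper.

```python
import base64
import hashlib
import io
from email.message import Message


def encoding_from_headers(headers, default='ISO-8859-1'):
    # Mirrors get_encoding_from_headers: prefer an explicit charset
    # parameter; fall back to the default only for text/* content types.
    content_type = headers.get('content-type')
    if not content_type:
        return None
    msg = Message()
    msg['content-type'] = content_type
    charset = msg.get_param('charset')
    if charset is not None:
        return str(charset).strip("'\"")
    if 'text' in msg.get_content_type():
        return default
    return None


def streaming_md5_b64(fileobj, chunk_size=1024 * 1024):
    # Mirrors _calculate_md5_from_file + calculate_md5: hash in chunks,
    # restore the file position, return the base64-encoded digest.
    start = fileobj.tell()
    digest = hashlib.md5()
    for chunk in iter(lambda: fileobj.read(chunk_size), b''):
        digest.update(chunk)
    fileobj.seek(start)
    return base64.b64encode(digest.digest()).decode('ascii')


print(encoding_from_headers({'content-type': 'text/html; charset="utf-8"'}))  # utf-8
print(encoding_from_headers({'content-type': 'text/plain'}))  # ISO-8859-1
print(streaming_md5_b64(io.BytesIO(b'hello')))  # XUFAKrxLKna5cZ2REBfFkg==
```

# Restoring the stream position in streaming_md5_b64 matters for request
# retries: the same body object can be hashed and then re-read from the start.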