toil.lib.aws.utils¶
Attributes¶
- ClientError
- logger
- THROTTLED_ERROR_CODES
Exceptions¶
- NoBucketLocationError: Error to represent that we could not get a location for a bucket.
Functions¶
- delete_sdb_domain
- connection_reset: Return true if an error is a connection reset error.
- connection_error: Return True if an error represents a failure to make a network connection.
- retryable_s3_errors: Return true if this is an error from S3 that looks like we ought to retry our request.
- retry_s3: Retry iterator of context managers specifically for S3 operations.
- delete_s3_bucket: Delete the given S3 bucket.
- create_s3_bucket: Create an AWS S3 bucket, using the given Boto3 S3 resource, with the given name, in the given region.
- enable_public_objects: Enable a bucket to contain objects which are public.
- get_bucket_region: Get the AWS region name associated with the given S3 bucket, or raise NoBucketLocationError.
- bucket_location_to_region
- get_object_for_url: Extracts a key (object) from a given parsed s3:// URL.
- list_objects_for_url: List the keys (objects) under a given parsed s3:// URL prefix.
- flatten_tags: Convert tags from a key-to-value dict into a list of {'Key': xxx, 'Value': xxx} dicts.
- boto3_pager: Yield all the results from calling the given Boto 3 method with the given keyword arguments.
- get_item_from_attributes: Given a list of attributes, find the attribute associated with the name and return its corresponding value.
Module Contents¶
- toil.lib.aws.utils.ClientError = None¶
- toil.lib.aws.utils.logger¶
- toil.lib.aws.utils.THROTTLED_ERROR_CODES = ['Throttling', 'ThrottlingException', 'ThrottledException', 'RequestThrottledException',...¶
- toil.lib.aws.utils.delete_sdb_domain(sdb_domain_name, region=None, quiet=True)¶
- toil.lib.aws.utils.connection_reset(e)¶
Return true if an error is a connection reset error.
- toil.lib.aws.utils.connection_error(e)¶
Return True if an error represents a failure to make a network connection.
- toil.lib.aws.utils.retryable_s3_errors(e)¶
Return true if this is an error from S3 that looks like we ought to retry our request.
- toil.lib.aws.utils.retry_s3(delays=DEFAULT_DELAYS, timeout=DEFAULT_TIMEOUT, predicate=retryable_s3_errors)¶
Retry iterator of context managers specifically for S3 operations.
- Parameters:
delays (collections.abc.Iterable[float])
timeout (float)
- Return type:
collections.abc.Iterator[ContextManager[None]]
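A minimal usage sketch, assuming an s3_client created elsewhere with boto3; the bucket and key names are placeholders. Each yielded attempt is a context manager, and an error raised inside the with block that matches the predicate triggers another iteration:

```python
import boto3

from toil.lib.aws.utils import retry_s3

s3_client = boto3.client("s3")  # hypothetical client for this example

for attempt in retry_s3():
    with attempt:
        # Retried automatically if the error matches retryable_s3_errors.
        response = s3_client.head_object(Bucket="my-bucket", Key="some/key")
```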
- toil.lib.aws.utils.delete_s3_bucket(s3_resource, bucket, quiet=True)¶
Delete the given S3 bucket.
- toil.lib.aws.utils.create_s3_bucket(s3_resource, bucket_name, region)¶
Create an AWS S3 bucket, using the given Boto3 S3 resource, with the given name, in the given region.
Supports the us-east-1 region, where bucket creation is special.
ALL S3 bucket creation should use this function.
- Parameters:
s3_resource (mypy_boto3_s3.S3ServiceResource)
bucket_name (str)
region (toil.lib.aws.AWSRegionName)
- Return type:
mypy_boto3_s3.service_resource.Bucket
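A short usage sketch; the Boto3 resource and bucket name are placeholders, not part of the documented API:

```python
import boto3

from toil.lib.aws.utils import create_s3_bucket, delete_s3_bucket

s3_resource = boto3.resource("s3")

# Also works for us-east-1, where bucket creation is special.
bucket = create_s3_bucket(s3_resource, "my-example-bucket", "us-west-2")
bucket.wait_until_exists()

# Clean up with the companion helper documented above.
delete_s3_bucket(s3_resource, "my-example-bucket")
```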
- toil.lib.aws.utils.enable_public_objects(bucket_name)¶
Enable a bucket to contain objects which are public.
This adjusts the bucket’s Public Access Block setting to not block all public access, and also adjusts the bucket’s Object Ownership setting to a setting which enables object ACLs.
Does not touch the account’s Public Access Block setting, which can also interfere here. That is probably best left to the account administrator.
This configuration used to be the default, and is what most of Toil’s code is written to expect, but it was changed so that new buckets default to the more restrictive setting (<https://aws.amazon.com/about-aws/whats-new/2022/12/amazon-s3-automatically-enable-block-public-access-disable-access-control-lists-buckets-april-2023/>), with the expectation that people would write IAM policies for the buckets to allow public access if needed. Toil expects to be able to make arbitrary objects in arbitrary places public, and naming them all in an IAM policy would be a very awkward way to do it. So we restore the old behavior.
- Parameters:
bucket_name (str)
- Return type:
None
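To illustrate the two bucket-level settings the docstring describes (this is only a sketch with plain Boto3 calls, not Toil’s actual implementation; the client and bucket name are placeholders):

```python
import boto3

s3_client = boto3.client("s3")
bucket_name = "my-example-bucket"  # placeholder

# Stop blocking public access at the bucket level.
s3_client.put_public_access_block(
    Bucket=bucket_name,
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": False,
        "IgnorePublicAcls": False,
        "BlockPublicPolicy": False,
        "RestrictPublicBuckets": False,
    },
)

# Switch Object Ownership to a setting that keeps object ACLs enabled.
s3_client.put_bucket_ownership_controls(
    Bucket=bucket_name,
    OwnershipControls={"Rules": [{"ObjectOwnership": "ObjectWriter"}]},
)
```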
- exception toil.lib.aws.utils.NoBucketLocationError¶
Bases:
Exception
Error to represent that we could not get a location for a bucket.
- toil.lib.aws.utils.get_bucket_region(bucket_name, endpoint_url=None, only_strategies=None)¶
Get the AWS region name associated with the given S3 bucket, or raise NoBucketLocationError.
Does not log at info level or above when this does not work; failures are expected in some contexts.
Takes an optional S3 API URL override.
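A small usage sketch; the bucket name is a placeholder and the fallback region is an arbitrary choice for the example:

```python
from toil.lib.aws.utils import NoBucketLocationError, get_bucket_region

try:
    region = get_bucket_region("my-example-bucket")
except NoBucketLocationError:
    # Expected to fail in some contexts; fall back to a default of our choosing.
    region = "us-east-1"
```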
- toil.lib.aws.utils.bucket_location_to_region(location)¶
- toil.lib.aws.utils.get_object_for_url(url, existing=None)¶
Extracts a key (object) from a given parsed s3:// URL.
If existing is true and the object does not exist, raises FileNotFoundError.
- Parameters:
existing (bool) – If True, the key is expected to exist. If False, the key is expected not to exist and it will be created. If None, the key will be created if it doesn’t exist.
url (urllib.parse.ParseResult)
- Return type:
mypy_boto3_s3.service_resource.Object
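A usage sketch; the URL is a placeholder:

```python
from urllib.parse import urlparse

from toil.lib.aws.utils import get_object_for_url

parsed = urlparse("s3://my-example-bucket/path/to/key")

# existing=True: raise FileNotFoundError if the key is not already there.
obj = get_object_for_url(parsed, existing=True)
print(obj.bucket_name, obj.key, obj.content_length)
```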
- toil.lib.aws.utils.list_objects_for_url(url)¶
List the keys (objects) under a given parsed s3:// URL prefix. The URL will be supplemented with a trailing slash if it is missing.
- Parameters:
url (urllib.parse.ParseResult)
- Return type:
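A usage sketch, assuming the function yields the key names found under the prefix (the return type is not shown above); the URL is a placeholder:

```python
from urllib.parse import urlparse

from toil.lib.aws.utils import list_objects_for_url

parsed = urlparse("s3://my-example-bucket/some/prefix")
for key in list_objects_for_url(parsed):
    print(key)
```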
- toil.lib.aws.utils.flatten_tags(tags)¶
Convert tags from a key-to-value dict into a list of {‘Key’: xxx, ‘Value’: xxx} dicts.
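For example (illustrative values):

```python
from toil.lib.aws.utils import flatten_tags

flatten_tags({"Owner": "alice", "Project": "toil"})
# [{'Key': 'Owner', 'Value': 'alice'}, {'Key': 'Project', 'Value': 'toil'}]
```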
- toil.lib.aws.utils.boto3_pager(requestor_callable, result_attribute_name, **kwargs)¶
Yield all the results from calling the given Boto 3 method with the given keyword arguments, paging through the results using the Marker or NextToken, and looping over the list found under the given attribute name in each response.
- Parameters:
requestor_callable (Callable[Ellipsis, Any])
result_attribute_name (str)
kwargs (Any)
- Return type:
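A usage sketch with a Marker-paginated IAM call; the client is a placeholder, and the keyword arguments are passed through to the underlying method:

```python
import boto3

from toil.lib.aws.utils import boto3_pager

iam_client = boto3.client("iam")

# Pages through ListRoles via its Marker, yielding each entry of the
# 'Roles' list from every response.
for role in boto3_pager(iam_client.list_roles, "Roles", PathPrefix="/"):
    print(role["RoleName"])
```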
- toil.lib.aws.utils.get_item_from_attributes(attributes, name)¶
Given a list of attributes, find the attribute associated with the name and return its corresponding value.
The attributes argument is a list of TypedDicts (which Boto3 SDB functions commonly return), where each TypedDict has a “Name” and a “Value” key-value pair. This function returns the value from the TypedDict whose “Name” matches the given name.
If no attribute with that name exists, the function returns None.
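For example, with SimpleDB-style attribute dicts (illustrative values):

```python
from toil.lib.aws.utils import get_item_from_attributes

attributes = [
    {"Name": "size", "Value": "1024"},
    {"Name": "owner", "Value": "alice"},
]

get_item_from_attributes(attributes, "owner")    # -> 'alice'
get_item_from_attributes(attributes, "missing")  # -> None
```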