Reference
aios3.file
S3 "files" operations with aiobotocore.
Classes
Functions
aios3.file.chunks
async chunks(bucket: str, key: str, amt: Optional[int] = None, s3: Optional[AioBaseClient] = None) -> AsyncGenerator[bytearray, None]

Generate chunks of the file, each at most amt bytes.
Args:
    bucket: S3 bucket.
    key: Path in the bucket, including the file name.
    amt: Maximum number of bytes to read in one chunk. By default the file is read as one chunk.
    s3: boto S3 client. By default it is created inside the function.
Return: Chunks of the file.
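A minimal usage sketch (the bucket and key below are placeholders); since s3 is omitted, the client is created inside chunks():

```python
import asyncio

from aios3.file import chunks


async def main() -> None:
    # Iterate over the object in chunks of at most 1 MiB each.
    # No s3 client is passed, so chunks() creates one internally.
    async for chunk in chunks("my-bucket", "path/to/file.bin", amt=1024 * 1024):
        print(f"got {len(chunk)} bytes")


asyncio.run(main())
```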
aios3.file.read
async read(bucket: str, key: str, amt: Optional[int] = None, s3: Optional[AioBaseClient] = None) -> bytes
Read the full content of the file as bytes.
Args:
    bucket: S3 bucket.
    key: Path in the bucket, including the file name.
    amt: Maximum number of bytes to read in one chunk. By default everything is read at once.
    s3: boto S3 client. By default it is created inside the function.
Return: The file content as bytes.
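A short sketch with placeholder bucket and key names. It passes an explicit client, assuming the usual aiobotocore get_session()/create_client() pattern; read() would otherwise create its own client:

```python
import asyncio

from aiobotocore.session import get_session

from aios3.file import read


async def main() -> None:
    # Reuse one client across calls instead of letting read() create its own.
    session = get_session()
    async with session.create_client("s3") as s3:
        data = await read("my-bucket", "path/to/config.json", s3=s3)
        print(f"read {len(data)} bytes")


asyncio.run(main())
```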
aios3.file.save
async save(bucket: str, key: str, body: bytes, s3: Optional[AioBaseClient] = None) -> None

Create the file with the given body.
Args:
    bucket: S3 bucket.
    key: Path in the bucket, including the file name.
    body: Content to write into the file.
    s3: boto S3 client. By default it is created inside the function.
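A minimal sketch with placeholder bucket, key, and payload; the client is auto-created because s3 is omitted:

```python
import asyncio

from aios3.file import save


async def main() -> None:
    # Upload a small payload to the given key.
    await save("my-bucket", "path/to/hello.txt", b"hello, world")


asyncio.run(main())
```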
aios3.file.stream
async stream(bucket: str, key: str, amt: Optional[int] = None, s3: Optional[AioBaseClient] = None) -> IO[bytes]

Create a file-like object to stream the file content.
Args:
    bucket: S3 bucket.
    key: Path in the bucket, including the file name.
    amt: Maximum number of bytes to read in one chunk. By default the file is read as one chunk.
    s3: boto S3 client. By default it is created inside the function.
Return: Python file stream with the file content.
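A sketch with placeholder names, assuming (per the IO[bytes] annotation) that the returned object supports the regular blocking read():

```python
import asyncio

from aios3.file import stream


async def main() -> None:
    # Await stream() to obtain a file-like object over the content,
    # then read from it like a regular binary file (assumed from IO[bytes]).
    fileobj = await stream("my-bucket", "path/to/archive.tar")
    header = fileobj.read(512)
    print(f"first {len(header)} bytes read")


asyncio.run(main())
```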