Test S3 access
You can verify S3 access by writing and running a Python test script. The script
connects to the radosgw, creates a new bucket, and lists all buckets.
The values for aws_access_key_id and aws_secret_access_key are
taken from the access_key and secret_key values returned by the
radosgw-admin command.
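The procedure below assumes you already have those key values at hand. As a hedged illustration only, the following sketch shows how the access and secret keys could be pulled out of the JSON that radosgw-admin prints for a user; the user ID and key strings here are made-up placeholders, and the JSON sample mirrors only the relevant fragment of that output.

```python
import json

def extract_keys(user_info_json):
    """Return (access_key, secret_key) from radosgw-admin user info JSON."""
    info = json.loads(user_info_json)
    key_entry = info["keys"][0]  # first S3 key pair defined for the user
    return key_entry["access_key"], key_entry["secret_key"]

# Placeholder sample of the relevant part of `radosgw-admin user info` output.
sample = '''
{
  "user_id": "testuser",
  "keys": [
    {"user": "testuser", "access_key": "ACCESSKEYEXAMPLE", "secret_key": "SECRETKEYEXAMPLE"}
  ]
}
'''

print(extract_keys(sample))
```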
Prerequisites

- A running IBM Storage Ceph cluster.
- Root-level access to the nodes.
Procedure

- Enable the high availability repository.

- Install the python3-boto3 package:

  dnf install python3-boto3

- Create the Python script:

  vi s3test.py
- Add the following contents to the file:

  Syntax

  import boto3

  endpoint = ""  # enter the endpoint URL along with the port "http://URL:PORT"
  access_key = 'ACCESS'
  secret_key = 'SECRET'

  s3 = boto3.client(
      's3',
      endpoint_url=endpoint,
      aws_access_key_id=access_key,
      aws_secret_access_key=secret_key,
  )

  s3.create_bucket(Bucket='my-new-bucket')

  response = s3.list_buckets()
  for bucket in response['Buckets']:
      print("{name}\t{created}".format(
          name=bucket['Name'],
          created=bucket['CreationDate'],
      ))
- Replace endpoint with the URL of the host where you have configured the gateway service, that is, the gateway host. Ensure that the host setting resolves with DNS. Replace PORT with the port number of the gateway.

- Replace ACCESS and SECRET with the access_key and secret_key values.
- Run the script:

  python3 s3test.py

  The output will be something like the following:

  my-new-bucket 2022-05-31T17:09:10.000Z
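If the script fails to connect, a malformed endpoint is a common cause, since boto3 expects the full "http://URL:PORT" form described above. As a small stdlib-only sketch (not part of the procedure), a check like the following could validate the endpoint string before it is passed to boto3.client(); the host name gateway-host is a placeholder.

```python
from urllib.parse import urlparse

def valid_endpoint(endpoint):
    """True if endpoint looks like http(s)://HOST:PORT, as the script expects."""
    parts = urlparse(endpoint)
    return (
        parts.scheme in ("http", "https")
        and parts.hostname is not None
        and parts.port is not None
    )

print(valid_endpoint("http://gateway-host:8080"))  # True
print(valid_endpoint("gateway-host:8080"))         # False: no scheme
print(valid_endpoint("http://gateway-host"))       # False: no port
```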