Amazon S3 compatible API
We provide support for the Amazon S3 API. This means you can use the S3-compatible object storage API to access files on Google Docs, RackSpace Cloud Files, Microsoft OneDrive, or any of the many other clouds that we support.
The S3 API is available on our SaaS services and on the appliance.
It can be used with many S3-compatible tools.
Authentication
To use the S3 API you will need:
- Endpoint
- Access Key ID
- Secret Access Key
1. Endpoint
For our SaaS services use one of the following S3 API endpoints:
- http://s3.storagemadeeasy.com (US Server)
- https://s3.storagemadeeasy.com (US Server)
- http://s3eu.storagemadeeasy.com (EU Server)
- https://s3eu.storagemadeeasy.com (EU Server)
Naturally, the HTTP endpoints are not secure and should not be used for production data.
For appliances contact your administrator for the “Cloud S3 Domain Name”.
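As one illustration of how these endpoints are used, here is a minimal `~/.s3cfg` fragment for s3cmd, a common S3 command-line tool, pointed at the EU SaaS endpoint. This is a sketch under the assumption that s3cmd is among the tools that work with the service; the `access_key` and `secret_key` values are placeholders, filled in as described in the sections below.

```ini
# Minimal ~/.s3cfg sketch pointing s3cmd at the EU SaaS endpoint.
# access_key / secret_key are placeholders -- see sections 2 and 3.
[default]
access_key = myusername
secret_key = my-api-secret-key
host_base = s3eu.storagemadeeasy.com
host_bucket = s3eu.storagemadeeasy.com
use_https = True
```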
2. Access Key ID
The Access Key ID is your SME or File Fabric username.
3. Secret Access Key
You can obtain the API secret key by logging into the File Fabric, going to My Account in the sidebar, and copying the value shown as "API secret key".
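In practice your S3 tool handles request signing for you from these three values. For illustration only, the sketch below shows how classic S3 (Signature Version 2) request signing combines the Secret Access Key with the request details; this assumes the service accepts classic S3 signing, and the credential values are placeholders.

```python
import base64
import hashlib
import hmac


def s3_v2_signature(secret_key: str, method: str, date: str,
                    resource: str, content_md5: str = "",
                    content_type: str = "") -> str:
    """Compute a classic S3 (Signature Version 2) request signature.

    The string-to-sign is the HTTP method, Content-MD5, Content-Type,
    Date, and canonicalized resource joined by newlines, signed with
    HMAC-SHA1 and base64-encoded.
    """
    string_to_sign = "\n".join(
        [method, content_md5, content_type, date, resource])
    digest = hmac.new(secret_key.encode(), string_to_sign.encode(),
                      hashlib.sha1).digest()
    return base64.b64encode(digest).decode()


# Placeholder credentials for illustration only.
access_key = "myusername"          # your File Fabric username
secret_key = "example-secret-key"  # the API secret key from My Account

signature = s3_v2_signature(secret_key, "GET",
                            "Tue, 27 Mar 2007 19:36:42 +0000", "/")
auth_header = f"AWS {access_key}:{signature}"
print(auth_header)
```

The resulting `Authorization: AWS <AccessKeyID>:<signature>` header is what S3 clients attach to each request.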
Usefulness
The S3 API can be useful in a number of scenarios. For example:
- Building an S3-compatible cloud, using the SMEStorage Cloud Appliance, that is delivered by one or more different storage implementations.
- Transitioning from existing S3 use to a different cloud without having to change code, i.e. changing only the endpoint.
- Using S3 tools and scripts to work with clouds other than S3.
Restrictions
The SME S3 API does not support multipart upload.
Certain S3 client tools may require a setting change to turn off multipart upload.
For example, this is necessary with Cyberduck, where multipart uploads can be disabled by setting the hidden option s3.upload.multipart to false. See https://trac.cyberduck.ch/wiki/help/en/howto/s3 for further information.
Implementation
SME supports the following Amazon S3 API Requests:
- PUT request to create a bucket
- GET to list buckets
- GET to list contents of a bucket (list objects)
- PUT object to upload a new file
- DELETE to delete object from bucket
- PUT with header x-amz-copy-source to copy object
- HEAD to get object metadata
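The verb/resource mapping of the supported requests above can be sketched as a small table. The function and names below are illustrative only, not part of any SDK; a real client would also sign each request as described under Authentication.

```python
# Sketch of the REST verb/resource mapping for the supported S3
# operations. Names are illustrative, not part of any SDK.

def s3_request(operation: str, bucket: str = "", key: str = "",
               copy_source: str = ""):
    """Return (method, path, extra_headers) for a supported operation."""
    ops = {
        "create_bucket": ("PUT", f"/{bucket}", {}),
        "list_buckets":  ("GET", "/", {}),
        "list_objects":  ("GET", f"/{bucket}", {}),
        "put_object":    ("PUT", f"/{bucket}/{key}", {}),
        "delete_object": ("DELETE", f"/{bucket}/{key}", {}),
        "copy_object":   ("PUT", f"/{bucket}/{key}",
                          {"x-amz-copy-source": copy_source}),
        "head_object":   ("HEAD", f"/{bucket}/{key}", {}),
    }
    return ops[operation]


# Copying an object is a PUT on the destination with the source
# named in the x-amz-copy-source header.
method, path, headers = s3_request("copy_object", "backups", "b.txt",
                                   copy_source="/docs/a.txt")
print(method, path, headers)
```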