
API Docs - v1.0.3

Tested Siddhi Core version: 5.1.2

It may also work with other Siddhi Core minor versions.

Sink

s3 (Sink)

The S3 sink publishes events as objects into an Amazon AWS S3 bucket.

Syntax

@sink(type="s3", credential.provider.class="<STRING>", aws.access.key="<STRING>", aws.secret.key="<STRING>", bucket.name="<STRING>", aws.region="<STRING>", versioning.enabled="<BOOL>", object.path="<STRING>", storage.class="<STRING>", content.type="<STRING>", bucket.acl="<STRING>", node.id="<STRING>", @map(...)))

QUERY PARAMETERS

| Name | Description | Default Value | Possible Data Types | Optional | Dynamic |
|------|-------------|---------------|---------------------|----------|---------|
| credential.provider.class | AWS credential provider class to be used. If this is left blank along with aws.access.key and aws.secret.key, the default credential provider is used. | EMPTY_STRING | STRING | Yes | No |
| aws.access.key | AWS access key. This cannot be used along with credential.provider.class. | EMPTY_STRING | STRING | Yes | No |
| aws.secret.key | AWS secret key. This cannot be used along with credential.provider.class. | EMPTY_STRING | STRING | Yes | No |
| bucket.name | Name of the S3 bucket. | | STRING | No | No |
| aws.region | The region in which the bucket is created. | EMPTY_STRING | STRING | Yes | No |
| versioning.enabled | Flag to enable versioning support in the bucket. | false | BOOL | Yes | No |
| object.path | Path for each S3 object. | | STRING | No | Yes |
| storage.class | AWS storage class. | standard | STRING | Yes | No |
| content.type | Content type of the event. | application/octet-stream | STRING | Yes | Yes |
| bucket.acl | Access control list for the bucket. | EMPTY_STRING | STRING | Yes | No |
| node.id | The node ID of the current publisher. This must be unique for each publisher instance; otherwise, publishers uploading objects to the same S3 bucket may overwrite each other's objects. | EMPTY_STRING | STRING | Yes | No |
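For comparison with the credential provider approach used in the example below, here is a minimal sketch of a sink definition that authenticates with static credentials via aws.access.key and aws.secret.key instead. The bucket name, object path, region, placeholder key values, and the EventStream definition are illustrative assumptions, not values from this document.

-- Minimal S3 sink using static credentials (illustrative values)
@sink(type='s3', bucket.name='event-archive-bucket', object.path='logs/events',
    aws.access.key='<ACCESS_KEY>', aws.secret.key='<SECRET_KEY>',
    aws.region='us-east-1', storage.class='standard',
    @map(type='json'))
define stream EventStream(id string, payload string);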

Examples

EXAMPLE 1

@sink(type='s3', bucket.name='user-stream-bucket', object.path='bar/users', credential.provider.class='software.amazon.awssdk.auth.credentials.ProfileCredentialsProvider', flush.size='3',
    @map(type='json', enclosing.element='$.user',
        @payload("""{"name": "{{name}}", "age": {{age}}}""")))
define stream UserStream(name string, age int);

This creates an S3 bucket named 'user-stream-bucket', collects every 3 events into a JSON object, and saves it under the 'bar/users' path in the bucket.
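Because object.path is a dynamic parameter, it can also be templated from event attributes. The following sketch is an assumption for illustration only: the country attribute, the extended UserStream definition, and the resulting per-country paths are not part of this document.

-- Dynamic object path built from the event's country attribute (illustrative)
@sink(type='s3', bucket.name='user-stream-bucket', object.path='users/{{country}}',
    credential.provider.class='software.amazon.awssdk.auth.credentials.ProfileCredentialsProvider',
    @map(type='json'))
define stream UserStream(name string, age int, country string);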