Upload files to AWS S3 in k6

In my last post, we discussed how to upload files to AWS S3 in JMeter using Groovy. We have also seen the Grape package manager in JMeter. Recently, k6 announced its next iteration with a lot of new features and fixes. In this blog post, we are going to see how to upload files to AWS S3 in k6.

What’s new in k6 0.38.1?

k6 0.38.1 is a minor patch release. To view all the new features, check out the k6 0.38.0 tag. The following are the noteworthy features in 0.38.0.

  • AWS JSLib
  • Tagging metric values
  • Dumping SSL keys to an NSS formatted key log file
  • Accessing the consolidated and derived options from the default function

AWS JSLib Features

Amazon Web Services is one of the widely used public cloud platforms. The AWS JSLib is packed with three modules (as of this writing):

  • S3 Client
  • Secrets Manager Client
  • AWS Config

S3 Client

The S3Client module helps to list S3 buckets and to create, upload, and delete S3 objects. To access the AWS services, each client (in this case, S3Client) uses AWS credentials such as the Access Key ID and Secret Access Key. The credentials must have the required privileges to perform the tasks; otherwise, the request will fail.
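
For example, a minimal IAM policy covering the operations used in this post could look like the sketch below. The bucket name k6test matches the one used later in the script; treat this as an illustration, not a hardened security recommendation.

{
  "Version": "2012-10-17",
  "Statement": [
    { "Effect": "Allow", "Action": "s3:ListAllMyBuckets", "Resource": "*" },
    {
      "Effect": "Allow",
      "Action": ["s3:ListBucket", "s3:GetObject", "s3:PutObject", "s3:DeleteObject"],
      "Resource": ["arn:aws:s3:::k6test", "arn:aws:s3:::k6test/*"]
    }
  ]
}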

Prerequisites to upload objects to S3

The following are the prerequisites to upload objects to S3 in k6.

  • The latest version of k6 (0.38.0 or above)
  • AWS Access Key ID and Secret Access Key with the relevant S3 permissions
  • Basic knowledge of k6

https://www.youtube.com/playlist?list=PLJ9A48W0kpRJKmVeurt7ltKfrOdr8ZBdt

Learn k6 Series

How to upload files to S3?

Create a new k6 script in your favorite editor and name it uploadFiletoS3.js. To upload files to S3, the first step is to import the following modules into your k6 script.

import exec from 'k6/execution'

import {
    AWSConfig,
    S3Client,
} from 'https://jslib.k6.io/aws/0.3.0/s3.js'

The next step is to read the AWS credentials from the command line. It is not recommended to hard-code the secrets into the script, as that raises a security concern. Using __ENV variables, it is easy to pass the values into the k6 script. To pass the AWS config such as the region, access key, and secret, use the AWSConfig object as shown below.

const awsConfig = new AWSConfig(
    __ENV.AWS_REGION,
    __ENV.AWS_ACCESS_KEY_ID,
    __ENV.AWS_SECRET_ACCESS_KEY
)

The next step is to create an S3 client by wrapping AWSConfig into it using the below code.

const s3 = new S3Client(awsConfig);

The S3Client s3 has the following methods (a quick sketch of listObjects() follows the list):

  • listBuckets()
  • listObjects(bucketName, [prefix])
  • getObject(bucketName, objectKey)
  • putObject(bucketName, objectKey, data)
  • deleteObject(bucketName, objectKey)
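
Of these, listObjects() is the only method not used later in this post. Below is a minimal sketch of it, meant to run inside the default function once the s3 client and bucket name are defined; since the exact shape of the returned objects is easiest to see at runtime, the sketch simply dumps them to the console.

// list the objects stored under the bucket and dump them to the console
const objects = s3.listObjects(testBucketName)
console.log(JSON.stringify(objects))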

Now we know the s3 methods. The next step is to create a dummy file to upload. Create a file with sample contents and save it in the current directory, e.g. test.txt.

After creating the dummy file, we need to load it into the script using the open() method. Copy and paste the below code:

const data = open('test.txt', 'r')
const testBucketName = 'k6test'
const testFileKey = 'test.txt'

The open() method reads the contents of a file and loads them into memory to be used in the script. The open() method takes two arguments: the file path and the mode. By default, it reads the file as text (r); to read it as binary, use b.

Note that the open() method works only in the init context.

The variables data, testBucketName, and testFileKey hold the data to upload, the bucket name in S3, and the file key, respectively.
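
This post uploads a text file, but the same flow works for binary content. A minimal sketch, assuming a hypothetical image logo.png in the current directory:

// in the init context: read the file as binary using mode 'b'
const binData = open('logo.png', 'b')

// later, inside the default function:
// s3.putObject(testBucketName, 'logo.png', binData)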

The next step is to define the default function, the main context of the script. Let us begin with listing the buckets. The variable buckets below will hold an array containing each bucket object.

const buckets = s3.listBuckets()

Optionally, if you would like to loop through the buckets, use the below code snippet.

for (let bucket in buckets) {
        console.log(buckets[bucket].name);
}

Or you can use the filter() method as shown below.

buckets.filter(bucket => bucket.name === testBucketName)

Let us add a checkpoint to verify whether the bucket is present or not. If the bucket is present, the script will proceed to upload; otherwise, the execution will abort. Copy and paste the below snippet.

if (buckets.filter((bucket) => bucket.name === testBucketName).length == 0) {
        exec.test.abort()
}
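
As a side note, exec.test.abort() also accepts an optional reason string, which makes the failure easier to spot in the run output:

// abort the whole test run with a descriptive reason
exec.test.abort(`bucket ${testBucketName} not found`)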

The next step is to upload the object to S3 using the putObject() method.

s3.putObject(testBucketName, testFileKey, data)

Here is the final script.

import exec from "k6/execution";

import { AWSConfig, S3Client } from "https://jslib.k6.io/aws/0.3.0/s3.js";

const awsConfig = new AWSConfig(
  __ENV.AWS_REGION,
  __ENV.AWS_ACCESS_KEY_ID,
  __ENV.AWS_SECRET_ACCESS_KEY
);

const s3 = new S3Client(awsConfig);

const knowledge = open("check.txt", "r");
const testBucketName = "k6test";
const testFileKey = "check.txt";

// main function
export default function () {
  const buckets = s3.listBuckets();


  if (buckets.filter((bucket) => bucket.name === testBucketName).length == 0) {
    exec.test.abort();
  }

  s3.putObject(testBucketName, testFileKey, data);
  console.log("Uploaded " + testFileKey + " to S3");

}

Save the above script and execute the below command.

k6 run -e AWS_REGION=ZZ-ZZZZ-Z -e AWS_ACCESS_KEY_ID=XXXXXXXXXXXXXXX -e AWS_SECRET_ACCESS_KEY=YYYYYYYYYYYYYYYYY uploadFiletoS3.js

To store the variables in PowerShell, you can use the below command, e.g.

Set-Variable -Name "AWS_REGION" -Value "us-east-2"
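
Similarly, for the access key and the secret (the values below are placeholders):

Set-Variable -Name "AWS_ACCESS_KEY_ID" -Value "XXXXXXXXXXXXXXX"
Set-Variable -Name "AWS_SECRET_ACCESS_KEY" -Value "YYYYYYYYYYYYYYYYY"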

To execute, you can use the below command.

k6 run -e AWS_REGION=$AWS_REGION -e AWS_ACCESS_KEY_ID=$AWS_ACCESS_KEY_ID -e AWS_SECRET_ACCESS_KEY=$AWS_SECRET_ACCESS_KEY uploadFiletoS3.js

Navigate to the S3 console and go to the bucket to check the file object. You can download the file to verify the contents.

S3 Validation
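
Alternatively, if you have the AWS CLI configured with the same credentials, a quick listing from the terminal (using the bucket name from the script) serves as a sanity check:

aws s3 ls s3://k6test/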

Congratulations! You have successfully uploaded the file to S3. If you would like to delete the file object, use the below code snippet.

s3.deleteObject(testBucketName, testFileKey)

To read the content from the S3 bucket, you can use the below snippet.

const fileContent = s3.getObject(testBucketName, testFileKey);
console.log(fileContent.data);
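
Optionally, you could assert that the downloaded content matches what was uploaded by using k6's check() method. The sketch below reuses the data variable loaded in the init context:

import { check } from 'k6';

// inside the default function, after getObject()
check(fileContent, {
  'downloaded content matches uploaded file': (obj) => obj.data === data,
});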

Final Thoughts

The k6 AWS library is neatly designed around frequently used AWS services and methods. Right now, it supports the S3 Client, Secrets Manager Client, and AWS Config. Hopefully, the k6 team will add more services, which will help developers and performance engineers.
