multicloud365
Migrating to AWS JavaScript SDK v3: Lessons Learned

by admin
May 9, 2025
in AWS


There’s work coming your way! Node.js 16 reached end-of-life on September 11th, 2023. Additionally, the AWS Lambda runtime environment for Node.js 18 upgraded to v3 of the AWS SDK for JavaScript. So to upgrade Lambda functions from Node.js 16 to 18, you have to migrate to v3 of the AWS JavaScript SDK as well. Unfortunately, v3 is not backward compatible with v2. In the following, I’ll share what I stumbled upon while upgrading many Lambda functions to v3.


When upgrading the AWS JavaScript SDK from v2 to v3, you should keep the official API reference pages at hand.

Import and Client

The first step is to import the SDK and initialize a client.

Old (v2)

v2 provided CommonJS modules only. This is how to import the SDK, the SQS client in this example.

const AWS = require('aws-sdk');
const sqs = new AWS.SQS({apiVersion: '2012-11-05'});

New (v3)

With v3, there are two options to import the SDK. Here is how to import the SQS client using ES modules.

Native JavaScript modules, or ES modules, are the modern approach to splitting JavaScript programs into separate modules.

import { SQSClient } from '@aws-sdk/client-sqs';
const sqs = new SQSClient({apiVersion: '2012-11-05'});

By default, Lambda functions use CommonJS modules. To use ES modules, use the file suffix .mjs instead of .js, or set type to module in the package.json.
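For example, opting a whole package into ES modules could look like this (a minimal sketch; a real package.json will contain more fields, such as name and version):

```json
{
  "type": "module"
}
```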

If you want to stick with CommonJS modules to avoid rewriting larger parts of your code, this is how to import the SQS client, for example.

const { SQSClient } = require('@aws-sdk/client-sqs');
const sqs = new SQSClient({apiVersion: '2012-11-05'});

Commands instead of methods

AWS decided to use a command-style approach for v3 of the AWS JS SDK. So you send commands instead of calling methods. Unfortunately, this requires rewriting a lot of code.

Previous (v2)

Instead of calling listContainerInstances(...) …

const AWS = require('aws-sdk');
const ecs = new AWS.ECS({apiVersion: '2014-11-13'});

ecs.listContainerInstances({
  cluster: 'demo',
  status: 'ACTIVE'
});

New (v3)

… send a ListContainerInstancesCommand. Luckily, the parameters stay the same.

const { ECSClient, ListContainerInstancesCommand } = require('@aws-sdk/client-ecs');
const ecs = new ECSClient({apiVersion: '2014-11-13'});

ecs.send(new ListContainerInstancesCommand({
  cluster: 'demo',
  status: 'ACTIVE'
}));
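To illustrate the pattern itself, here is a minimal plain-JavaScript sketch of a command-style client. This is a conceptual mock, not the real SDK; MockClient and ListWidgetsCommand are made-up names used only to show the send(command) dispatch.

```javascript
// Conceptual sketch of the v3 command pattern (not the real SDK).
// Each command object carries its own input; the client only knows
// how to send any command and resolve with a result.
class ListWidgetsCommand {
  constructor(input) {
    this.input = input;
  }
}

class MockClient {
  // send() dispatches an arbitrary command and returns a promise.
  async send(command) {
    return { commandName: command.constructor.name, input: command.input };
  }
}

const client = new MockClient();
client
  .send(new ListWidgetsCommand({ cluster: 'demo' }))
  .then((result) => console.log(result.commandName)); // prints "ListWidgetsCommand"
```

One benefit of this design is tree shaking: because every command is a separate export, a bundler can drop all API operations your code never imports.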

Promise

How to wait for results from AWS? I prefer using promises with the help of the async/await syntax.

Previous (v2)

v2 uses callbacks by default. Therefore, it was necessary to append promise() to every method call.

const AWS = require('aws-sdk');
const s3 = new AWS.S3({apiVersion: '2006-03-01'});

async function handler() {
  await s3.getObject({
    Bucket: 'demo',
    Key: 'hello.txt'
  }).promise();
}

New (v3)

v3 uses promises by default.

const { S3Client, GetObjectCommand } = require('@aws-sdk/client-s3');
const s3 = new S3Client({apiVersion: '2006-03-01'});

async function handler() {
  await s3.send(new GetObjectCommand({
    Bucket: 'demo',
    Key: 'hello.txt'
  }));
}

Callback

Do you prefer callbacks? Or do you want to avoid rewriting code?

Previous (v2)

As mentioned above, v2 defaults to callbacks.

const AWS = require('aws-sdk');
const iam = new AWS.IAM({apiVersion: '2010-05-08'});

iam.deleteAccountPasswordPolicy({}, (err, data) => {
  if (err) {
    console.log(err);
  }
});

New (v3)

But using callbacks is quite simple with v3 as well. The send(...) method accepts a callback function as the 2nd parameter.

const { IAMClient, DeleteAccountPasswordPolicyCommand } = require('@aws-sdk/client-iam');
const iam = new IAMClient({apiVersion: '2010-05-08'});

iam.send(new DeleteAccountPasswordPolicyCommand({}), (err, data) => {
  if (err) {
    console.log(err);
  }
});

Error handling

When things go wrong, handling errors is important.

Old (v2)

The code property of the error contains the error code.

const AWS = require('aws-sdk');
const s3 = new AWS.S3({apiVersion: '2006-03-01'});

try {
  await s3.getObject({
    Bucket: 'demo',
    Key: 'hello.txt'
  }).promise();
} catch (err) {
  if (err.code === 'NoSuchKey') {
    // handle the missing object here
  }
}

New (v3)

With v3, use the name property of the error.

const { S3Client, GetObjectCommand } = require('@aws-sdk/client-s3');
const s3 = new S3Client({apiVersion: '2006-03-01'});

try {
  await s3.send(new GetObjectCommand({
    Bucket: 'demo',
    Key: 'hello.txt'
  }));
} catch (err) {
  if (err.name === 'NoSuchKey') {
    // handle the missing object here
  }
}
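If you check error names in many places, a tiny helper keeps the comparisons consistent. This is a plain-JavaScript sketch; hasErrorName is a made-up name, and the simulated error below only mimics the shape of a v3 error.

```javascript
// Hypothetical helper: check an error by its name property,
// since v3 errors carry the error code in `name`.
function hasErrorName(err, name) {
  return err instanceof Error && err.name === name;
}

// Simulate an error shaped like a v3 NoSuchKey error.
const err = new Error('The specified key does not exist.');
err.name = 'NoSuchKey';

console.log(hasErrorName(err, 'NoSuchKey'));   // prints true
console.log(hasErrorName(err, 'AccessDenied')); // prints false
```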

S3 multipart upload

Splitting large files into multiple parts when uploading them to S3 is essential to improve performance.

Old (v2)

The S3 client shipped with the high-level method upload(...), which handles multipart uploads.

const AWS = require('aws-sdk');
const s3 = new AWS.S3({apiVersion: '2006-03-01'});

await s3.upload({
  Bucket: 'demo',
  Key: 'heavy.tar',
  Body: body
}).promise();

New (v3)

AWS moved that functionality from the S3 client to a separate module with v3.

const { S3Client } = require('@aws-sdk/client-s3');
const { Upload } = require('@aws-sdk/lib-storage');
const s3 = new S3Client({apiVersion: '2006-03-01'});

await new Upload({
  client: s3,
  params: {
    Bucket: 'demo',
    Key: 'heavy.tar',
    Body: body
  }
}).done();

The AWS JavaScript SDK v3 still does not support parallel byte-range fetches. Check out widdix/s3-getobject-accelerator to accelerate fetching objects from S3.

Streaming S3 results

When dealing with large files on S3, keeping them in memory is not an option. Use streams instead.

The following examples show how to download, transform, and upload an object.

Old (v2)

The createReadStream(...) method allows piping an object stored on S3 into a stream.

const zlib = require('zlib');
const stream = require('stream');
const AWS = require('aws-sdk');
const s3 = new AWS.S3({apiVersion: '2006-03-01'});

const body = stream.pipeline(
  s3.getObject({
    Bucket: 'demo',
    Key: 'hello.txt'
  }).createReadStream(),
  zlib.createGzip(),
  () => {}
);

await s3.upload({
  Bucket: 'demo',
  Key: 'hello.txt.gz',
  Body: body
}).promise();

New (v3)

With v3, the Body property of the GetObjectCommand and PutObjectCommand, as well as the Upload functionality (see above), returns or accepts streams out-of-the-box.

const zlib = require('node:zlib');
const { pipeline } = require('node:stream');
const { S3Client, GetObjectCommand } = require('@aws-sdk/client-s3');
const { Upload } = require('@aws-sdk/lib-storage');
const s3 = new S3Client({apiVersion: '2006-03-01'});

const getObjectResponse = await s3.send(new GetObjectCommand({
  Bucket: 'demo',
  Key: 'hello.txt'
}));

const bodyPipeline = pipeline(
  getObjectResponse.Body,
  zlib.createGzip(),
  () => {}
);

await new Upload({
  client: s3,
  params: {
    Bucket: 'demo',
    Key: 'hello.txt.gz',
    Body: bodyPipeline
  }
}).done();

Summary

Due to breaking changes between v2 and v3 of the AWS JavaScript SDK, migrating incurs a lot of work. But there is no way out. AWS plans to deprecate v2 soon. Also, the Node.js 18 environment for Lambda ships with v3 only.

Tags: AWS, JavaScript, Lessons Learned, Migrating, SDK
© 2025- https://multicloud365.com/ - All Rights Reserved
