
Global client #7

Open · wants to merge 1 commit into master

Conversation

@valeriomazzeo commented May 24, 2019

Both the `cache` and `rateLimit` options are worthless if the client is recreated for each lambda execution.

Due to the stateless nature of lambda functions, this cannot guarantee proper caching, but at least on warm starts the client will be reused within each lambda container, which is better than nothing.

Am I missing anything? Is the client cache somehow working even when recreated each time?

@agustin-vieta

@valeriomazzeo
Some form of caching can be achieved via the AWS Lambda execution context:

After a Lambda function is executed, AWS Lambda maintains the execution context for some time in anticipation of another Lambda function invocation.

To use that, your variables must be declared outside of the function's handler:

Objects declared outside of the function's handler method remain initialized, providing additional optimization when the function is invoked again. For example, if your Lambda function establishes a database connection, instead of reestablishing the connection, the original connection is used in subsequent invocations.
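A minimal sketch of that pattern, using a hypothetical `ExpensiveClient` as a stand-in for any client whose construction (connections, caches, rate-limit state) is costly:

```python
class ExpensiveClient:
    """Hypothetical client whose construction we want to amortize."""
    def __init__(self):
        self.calls = 0  # e.g. cache or rate-limit state worth preserving


# Created once per container, at module import time, i.e. outside
# the handler. Warm invocations in the same execution context reuse it.
_client = ExpensiveClient()


def handler(event, context):
    # On a warm start this reuses the module-level client; on a cold
    # start the module is re-imported and a fresh client is built.
    _client.calls += 1
    return _client
```

Two successive invocations in the same process return the same client object, which is exactly the reuse the execution context provides on warm starts.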

However, the problem with using the execution context is that AWS does not provide any guarantee on how long you can use it:

When you write your Lambda function code, do not assume that AWS Lambda automatically reuses the execution context for subsequent function invocations.

Thus, a caching solution based on the lambda execution context could work, but it would be limited in many ways. For a proper caching solution you would have to look outside of lambda, for example storing your serialized objects in ElastiCache or DynamoDB.
