Improve Performance and Resource Allocation behaviour #51

Open
6 of 7 tasks
ccwienk opened this issue May 29, 2024 · 0 comments

Context / Motivation

Delivery Gear is expected to be used with different loads and load patterns, ranging from small instances with few artefacts to process and few concurrent users, up to larger installations. In either case, periodic scans / version-updates will additionally cause load bursts.

Therefore, there are some optimisations we should pursue:

Caching

Limit (local) in-memory caching to a configurable maximum allowed memory size. Depending on the cached data, decide whether an in-memory cache or a local filesystem cache is more adequate. When in doubt, prefer the filesystem cache (possibly in conjunction with pickle rather than more expensive (de)serialisation via yaml/json + dacite).
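
To make the intended direction more concrete, a minimal sketch of both variants follows (class names, the default size limit and cache directory are illustrative assumptions, not existing Delivery-Gear code):

```python
import os
import pickle
import sys
import tempfile
from collections import OrderedDict


class BoundedInMemoryCache:
    '''LRU-style in-memory cache, limited to a configurable (approximate) byte size.'''

    def __init__(self, max_bytes: int = 64 * 1024 * 1024):
        self._max_bytes = max_bytes
        self._current_bytes = 0
        self._entries = OrderedDict()  # key -> (value, approximate size in bytes)

    def set(self, key: str, value):
        size = sys.getsizeof(value)  # rough approximation of the entry's footprint
        if key in self._entries:
            self._current_bytes -= self._entries[key][1]
        self._entries[key] = (value, size)
        self._entries.move_to_end(key)
        self._current_bytes += size
        # evict least-recently-used entries until we are below the configured limit
        while self._current_bytes > self._max_bytes and len(self._entries) > 1:
            _, (_, evicted_size) = self._entries.popitem(last=False)
            self._current_bytes -= evicted_size

    def get(self, key: str, default=None):
        if key not in self._entries:
            return default
        self._entries.move_to_end(key)  # mark as recently used
        return self._entries[key][0]


class FilesystemCache:
    '''Filesystem-backed cache using pickle (cheaper than yaml/json + dacite round-trips).'''

    def __init__(self, cache_dir: str = None):
        self._cache_dir = cache_dir or os.path.join(tempfile.gettempdir(), 'delivery-cache')
        os.makedirs(self._cache_dir, exist_ok=True)

    def set(self, key: str, value):
        with open(os.path.join(self._cache_dir, key), 'wb') as f:
            pickle.dump(value, f)

    def get(self, key: str, default=None):
        path = os.path.join(self._cache_dir, key)
        if not os.path.isfile(path):
            return default
        with open(path, 'rb') as f:
            return pickle.load(f)
```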

Use a centralised/shared cache to avoid cache loss / redundant caching between multiple pods.
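
For the shared cache, one conceivable realisation is a central key-value store such as Redis deployed alongside the pods; host name, port and TTL below are placeholders:

```python
import pickle

import redis  # assumes a centrally deployed, reachable redis instance


class SharedCache:
    '''Cache shared between pods, so restarts / scale-out do not lose cached entries.'''

    def __init__(self, host: str = 'delivery-cache', port: int = 6379, ttl_seconds: int = 3600):
        self._client = redis.Redis(host=host, port=port)
        self._ttl = ttl_seconds

    def set(self, key: str, value):
        # entries expire after the configured TTL to bound memory usage of the shared store
        self._client.set(key, pickle.dumps(value), ex=self._ttl)

    def get(self, key: str, default=None):
        raw = self._client.get(key)
        return pickle.loads(raw) if raw is not None else default
```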

Implement means for explicit cache invalidation (probably via an API route).
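
Explicit invalidation could then be exposed roughly as follows; falcon, the route path and the `delete`/`clear` cache methods are assumed purely for illustration:

```python
import falcon


class CacheInvalidationResource:
    '''Hypothetical route allowing operators to drop cached entries explicitly.'''

    def __init__(self, cache):
        self._cache = cache

    def on_delete(self, req, resp):
        key = req.get_param('key')  # if no key is given, clear the whole cache
        if key:
            self._cache.delete(key)
        else:
            self._cache.clear()
        resp.status = falcon.HTTP_200


def add_routes(app: falcon.App, cache):
    app.add_route('/cache', CacheInvalidationResource(cache))
```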

Resource Allocation / Auto-Scaling

Monitor actual load (number of parallel requests, request waiting time, CPU consumption / machine load) and use these load metrics for autoscaling via k8s means (within configurable boundaries). Consider the Delivery-Service separately from extensions.
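
On the k8s side, the "configurable boundaries" could be expressed via a `HorizontalPodAutoscaler` per deployment (separate ones for the Delivery-Service and each extension, so they scale independently); names, replica bounds and target utilisation below are placeholders:

```yaml
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: delivery-service
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: delivery-service
  minReplicas: 1   # configurable lower boundary
  maxReplicas: 5   # configurable upper boundary
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 80
```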

async (ASGI)

Thoroughly investigate switching to async/ASGI. Specifically analyse:

Monitoring / Metric-Export

Configure export of metrics to determine current workloads and bottlenecks. This information can and should afterwards be used to properly configure caching.
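
A minimal sketch of such a metrics export, assuming Prometheus-style scraping via `prometheus_client` (metric names and port are made up):

```python
import time

from prometheus_client import Counter, Gauge, Histogram, start_http_server

REQUEST_COUNT = Counter(
    'delivery_service_requests_total',
    'Total number of processed requests',
    ['route'],
)
REQUEST_LATENCY = Histogram(
    'delivery_service_request_latency_seconds',
    'Request waiting/processing time',
    ['route'],
)
IN_FLIGHT = Gauge(
    'delivery_service_requests_in_flight',
    'Number of requests currently being processed',
)


def observe(route: str, duration_seconds: float):
    '''to be called from request middleware to record one finished request'''
    REQUEST_COUNT.labels(route=route).inc()
    REQUEST_LATENCY.labels(route=route).observe(duration_seconds)


if __name__ == '__main__':
    start_http_server(8000)  # exposes /metrics for scraping
    while True:
        time.sleep(60)
```

The resulting series (request rate, waiting time, in-flight requests) correspond to the load metrics mentioned above for autoscaling and also show which data is worth caching.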
