FinOps-Monitoring-Logs-Cost

Title: FinOps Best Practices for Optimizing Logs Cost in DataDog Cloud

Introduction: In today's cloud-centric landscape, efficient cost management practices are crucial for organizations leveraging powerful tools like DataDog to gain insights into their applications through logs. This article aims to highlight FinOps (Financial Operations) best practices to help your organization optimize logs cost in DataDog Cloud, ensuring that you get the most value from your investment.

  1. Level-based Log Forwarding: DataDog lets you route logs to different destinations based on log level. By configuring forwarding strategically, you can ensure that only essential logs are indexed and processed in DataDog. In this setup, logs at the INFO level are forwarded to an AWS S3 bucket instead of being indexed, which avoids the storage and processing costs associated with lower-priority logs (a configuration sketch follows this list).

  2. Exclusion Filters for Indexed Logs: Use exclusion filters to keep only the more significant logs in your indexes. In this configuration, only logs with ERROR and WARNING levels are retained, which keeps the focus on critical issues and allows efficient troubleshooting and monitoring without the noise of less severe entries (see the exclusion-filter sketch after this list).

  3. Fine-tune Exclusion Filters: Continuously review and fine-tune your exclusion filters based on changing application behavior and business requirements. Regularly assess the necessity of excluding specific log levels or patterns to strike the right balance between visibility and cost.

  4. Monitor Usage Metrics: Track the "datadog.estimated_usage.logs.ingested_events" metric to keep an eye on log ingestion volume. This metric shows how much log data you are ingesting and helps you understand usage patterns. Set up alerts on unexpected spikes or drops in ingestion so that cost issues can be handled proactively (see the monitor sketch after this list).

  5. Periodic Review of Monitors: Regularly review and adjust your cost-related monitors to align with changing business priorities and application behavior. Ensure that your monitoring strategy aligns with the actual impact on your operations and addresses potential bottlenecks in resource consumption.

  6. Archive Old Logs: Implement a retention policy that identifies older logs and moves them to cost-effective storage. Infrequently accessed logs can be transitioned to a lower-cost tier such as Amazon S3 Glacier, reducing ongoing costs while still meeting data retention requirements (see the S3 lifecycle sketch after this list).

  7. Implement DataDog Features Efficiently: Explore and leverage DataDog features that can further reduce logs cost, such as log processing rules, custom parsing, and log sampling (the sample rate in the exclusion-filter sketch below is one way to sample indexed logs). Customizing your configuration around the unique needs of your applications leads to more efficient log management.
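
The sketch below shows one way the INFO-to-S3 forwarding from item 1 could be configured programmatically. It is a minimal sketch, assuming the DataDog v2 Logs Archives endpoint, an existing AWS integration role, and API/application keys read from environment variables; the bucket, path, account ID, and role name are placeholders, so check the payload against the current DataDog API documentation before using it.

```python
import os
import requests

# Create a DataDog log archive that routes INFO-level logs to an S3 bucket.
# Assumes the v2 Logs Archives API; bucket and role values are placeholders.
resp = requests.post(
    "https://api.datadoghq.com/api/v2/logs/config/archives",
    headers={
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
        "Content-Type": "application/json",
    },
    json={
        "data": {
            "type": "archives",
            "attributes": {
                "name": "info-logs-to-s3",
                "query": "status:info",  # only INFO-level logs
                "destination": {
                    "type": "s3",
                    "bucket": "my-log-archive-bucket",  # placeholder bucket
                    "path": "datadog/info",
                    "integration": {
                        "account_id": "123456789012",  # placeholder AWS account
                        "role_name": "DatadogLogArchiveRole",
                    },
                },
            },
        }
    },
    timeout=30,
)
resp.raise_for_status()
```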
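
For items 2 and 3, exclusion filters can be managed through the Logs Indexes API. This is a minimal sketch, assuming the v1 endpoint and an index named "main"; the query syntax and field names should be verified against the DataDog documentation. Lowering the sample rate below 1.0 keeps a fraction of the matching logs, which is one way to apply the log sampling mentioned in item 7.

```python
import os
import requests

HEADERS = {
    "DD-API-KEY": os.environ["DD_API_KEY"],
    "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
    "Content-Type": "application/json",
}

# Update the "main" index (placeholder name) so that DEBUG and INFO logs are
# excluded and only higher-severity logs such as ERROR and WARNING remain.
resp = requests.put(
    "https://api.datadoghq.com/api/v1/logs/config/indexes/main",
    headers=HEADERS,
    json={
        "filter": {"query": "*"},
        "exclusion_filters": [
            {
                "name": "drop-debug-and-info",
                "is_enabled": True,
                "filter": {
                    "query": "status:(debug OR info)",
                    # 1.0 excludes all matching logs; a lower value keeps a
                    # sample of them (the sampling mentioned in item 7).
                    "sample_rate": 1.0,
                },
            }
        ],
    },
    timeout=30,
)
resp.raise_for_status()
```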
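
For item 4, an ingestion monitor can be created with the Monitors API. A minimal sketch, assuming the v1 endpoint; the five-million-events-per-hour threshold and the "@your-team" notification handle are placeholders to be replaced with values tuned to your own baseline.

```python
import os
import requests

# Alert when hourly ingested log events exceed a placeholder threshold.
THRESHOLD = 5_000_000  # events per hour; tune to your own baseline

resp = requests.post(
    "https://api.datadoghq.com/api/v1/monitor",
    headers={
        "DD-API-KEY": os.environ["DD_API_KEY"],
        "DD-APPLICATION-KEY": os.environ["DD_APP_KEY"],
        "Content-Type": "application/json",
    },
    json={
        "name": "Log ingestion spike",
        "type": "query alert",
        "query": (
            "sum(last_1h):sum:datadog.estimated_usage.logs.ingested_events{*}"
            f".as_count() > {THRESHOLD}"
        ),
        "message": "Log ingestion is above the expected hourly volume. @your-team",
        "options": {"thresholds": {"critical": THRESHOLD}},
    },
    timeout=30,
)
resp.raise_for_status()
```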
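
For item 6, if the archived logs land in S3 (as in item 1), an S3 lifecycle rule can transition them to Glacier after a period and expire them later. A sketch using boto3; the bucket name, prefix, and day counts are placeholders.

```python
import boto3

s3 = boto3.client("s3")

# Move archived logs to Glacier after 90 days and delete them after 2 years.
# Bucket name, prefix, and day counts are placeholders.
s3.put_bucket_lifecycle_configuration(
    Bucket="my-log-archive-bucket",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-then-expire-datadog-logs",
                "Filter": {"Prefix": "datadog/info/"},
                "Status": "Enabled",
                "Transitions": [{"Days": 90, "StorageClass": "GLACIER"}],
                "Expiration": {"Days": 730},
            }
        ]
    },
)
```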

Conclusion: Effective FinOps practices are essential for managing costs in a cloud-centric environment. By implementing the above practices, your organization can strike a balance between visibility and cost, ensuring that logs are both comprehensive and cost-effective. Regularly review and refine your log management strategy to adapt to the evolving needs of your applications and business, ultimately maximizing the value derived from DataDog Cloud.

Diagrams

_finops_1.png

_finops_2.png
