Wednesday, 10 September 2025

Snowflake - Cost Optimization
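
A short, hedged SQL sketch for each step follows the checklist; the warehouse, table, task, and stage names used in those sketches are placeholders.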

  1. Reduce auto-suspend to 60 seconds
  2. Reduce virtual warehouse size
  3. Ensure minimum clusters are set to 1
  4. Consolidate warehouses
    • Separate warehouses by workload and requirement, not by business domain
  5. Reduce query frequency
    • At many organizations, batch data transformation jobs run hourly by default. But do the downstream use cases actually need such low latency? Check with the business before settling on a frequency (see the task sketch after this list).
  6. Only process new or updated data
  7. Ensure tables are clustered correctly
  8. Drop unused tables
  9. Lower data retention
    • The Time Travel (data retention) setting can add storage costs, because Snowflake must keep the data needed to restore every version of the table within the retention period.
  10. Use transient tables
  11. Avoid frequent DML operations
  12. Ensure files are optimally sized
    • For cost-effective data loading, a best practice is to keep your files at roughly 100-250 MB (compressed).
    • To illustrate the effect:
      • If we load a single 1 GB file, we saturate only 1 of the 16 load threads on a Small warehouse.
      • If we instead split that file into ten files of roughly 100 MB each, we use 10 of the 16 threads. This level of parallelization makes much better use of the compute we are already paying for (see the COPY sketch after this list).
  13. Leverage access control
  14. Enable query timeouts
  15. Configure resource monitors
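
Example SQL sketches

Steps 1-3 (warehouse settings). A minimal sketch of the three warehouse changes, assuming a hypothetical warehouse named TRANSFORM_WH; repeat for each warehouse you own.

    -- Suspend after 60 seconds of inactivity instead of the 600-second default
    ALTER WAREHOUSE TRANSFORM_WH SET AUTO_SUSPEND = 60;

    -- Try one size smaller and measure the impact on runtimes before committing
    ALTER WAREHOUSE TRANSFORM_WH SET WAREHOUSE_SIZE = 'SMALL';

    -- A multi-cluster warehouse should scale out only under load
    ALTER WAREHOUSE TRANSFORM_WH SET MIN_CLUSTER_COUNT = 1;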
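
Step 4 (consolidate warehouses). One way to spot consolidation candidates is to look at credit usage per warehouse; this query assumes access to the SNOWFLAKE.ACCOUNT_USAGE share.

    -- Credits per warehouse over the last 30 days; lightly used warehouses
    -- are candidates to be merged into a shared one
    SELECT warehouse_name,
           SUM(credits_used) AS credits_30d
    FROM snowflake.account_usage.warehouse_metering_history
    WHERE start_time >= DATEADD('day', -30, CURRENT_TIMESTAMP())
    GROUP BY warehouse_name
    ORDER BY credits_30d;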
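
Step 5 (reduce query frequency). If the transformation runs from a Snowflake task, the schedule is one ALTER away; the task name and cron expression here are hypothetical.

    ALTER TASK TRANSFORM_ORDERS SUSPEND;
    -- Hourly to every 6 hours, agreed with the business
    ALTER TASK TRANSFORM_ORDERS SET SCHEDULE = 'USING CRON 0 */6 * * * UTC';
    ALTER TASK TRANSFORM_ORDERS RESUME;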
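
Step 6 (only process new or updated data). A stream on the source table is one way to do this: each run reads only rows that changed since the previous run. Table and stream names are placeholders.

    CREATE OR REPLACE STREAM ORDERS_STREAM ON TABLE RAW_ORDERS;

    -- Consuming the stream in a DML statement advances its offset,
    -- so the next run sees only newly changed rows
    INSERT INTO STG_ORDERS (order_id, amount, updated_at)
    SELECT order_id, amount, updated_at
    FROM ORDERS_STREAM
    WHERE METADATA$ACTION = 'INSERT';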
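
Step 7 (clustering). Check clustering health before paying for a clustering key; the table and column are hypothetical, and an explicit key is usually only worthwhile for large tables that are frequently filtered on that column.

    -- How well is the table clustered on the candidate column?
    SELECT SYSTEM$CLUSTERING_INFORMATION('SALES.FACT_ORDERS', '(ORDER_DATE)');

    ALTER TABLE SALES.FACT_ORDERS CLUSTER BY (ORDER_DATE);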
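
Step 8 (drop unused tables). One way to find candidates is the ACCESS_HISTORY view (Enterprise edition); treat this as a sketch and confirm with the owning team before dropping anything.

    -- Last time each table was read, according to access history
    SELECT f.value:"objectName"::STRING AS table_name,
           MAX(ah.query_start_time)     AS last_read
    FROM snowflake.account_usage.access_history ah,
         LATERAL FLATTEN(input => ah.base_objects_accessed) f
    WHERE f.value:"objectDomain"::STRING = 'Table'
    GROUP BY table_name
    ORDER BY last_read;

    DROP TABLE IF EXISTS STAGING.OLD_SNAPSHOT;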
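
Step 9 (lower data retention). Retention can be set per table or per database; the default is 1 day, and longer periods mean more Time Travel storage.

    ALTER TABLE STAGING.DAILY_LOAD SET DATA_RETENTION_TIME_IN_DAYS = 1;
    -- Or set it once for a whole database
    ALTER DATABASE STAGING SET DATA_RETENTION_TIME_IN_DAYS = 1;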
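
Step 10 (transient tables). Transient tables have no Fail-safe period and at most 1 day of Time Travel, so they suit staging data that can easily be rebuilt; the schema here is made up.

    CREATE TRANSIENT TABLE STAGING.DAILY_LOAD_T (
        order_id  NUMBER,
        amount    NUMBER(12,2),
        loaded_at TIMESTAMP_NTZ
    )
    DATA_RETENTION_TIME_IN_DAYS = 0;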
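
Step 11 (avoid frequent DML). Frequent single-row changes churn micro-partitions, so batch them into set-based statements where possible; the tables here are hypothetical.

    -- Instead of thousands of single-row INSERTs, load the batch in one statement
    INSERT INTO FACT_EVENTS (event_id, event_type, event_ts)
    SELECT event_id, event_type, event_ts
    FROM STAGING.EVENTS_BATCH;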
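
Step 12 (file sizing). Once the 1 GB file is split into roughly 100-250 MB parts, a single COPY loads them in parallel across the warehouse's load threads; stage, path, and pattern are placeholders.

    COPY INTO RAW_EVENTS
    FROM @EVENTS_STAGE/2025-09-10/
    PATTERN = '.*events_part_[0-9]+[.]csv[.]gz'
    FILE_FORMAT = (TYPE = CSV FIELD_OPTIONALLY_ENCLOSED_BY = '"');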
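
Step 13 (access control). Limit who can resize warehouses or create new ones; role and warehouse names are hypothetical.

    -- Analysts can run queries on the warehouse but not resize it
    GRANT USAGE ON WAREHOUSE ANALYTICS_WH TO ROLE ANALYST;
    -- Keep MODIFY (resize) and warehouse creation with an admin role
    GRANT MODIFY ON WAREHOUSE ANALYTICS_WH TO ROLE WH_ADMIN;
    GRANT CREATE WAREHOUSE ON ACCOUNT TO ROLE WH_ADMIN;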
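
Step 14 (query timeouts). The default statement timeout is 2 days, so a single runaway query can burn credits for a long time; tighten it per warehouse (or account-wide).

    -- Cancel anything running longer than 1 hour on this warehouse
    ALTER WAREHOUSE TRANSFORM_WH SET STATEMENT_TIMEOUT_IN_SECONDS = 3600;
    -- Optionally also cap time spent waiting in the queue
    ALTER WAREHOUSE TRANSFORM_WH SET STATEMENT_QUEUED_TIMEOUT_IN_SECONDS = 600;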
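
Step 15 (resource monitors). A resource monitor puts a hard monthly credit cap on a warehouse; the quota and thresholds below are examples, not recommendations.

    CREATE RESOURCE MONITOR TRANSFORM_MONTHLY
      WITH CREDIT_QUOTA = 100
           FREQUENCY = MONTHLY
           START_TIMESTAMP = IMMEDIATELY
      TRIGGERS ON 80  PERCENT DO NOTIFY
               ON 100 PERCENT DO SUSPEND;

    ALTER WAREHOUSE TRANSFORM_WH SET RESOURCE_MONITOR = TRANSFORM_MONTHLY;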
