Grab's TLRU: Smarter Image Caching for Android

Alps Wang

Mar 15, 2026

Beyond LRU: Time-Aware Cache Efficiency

Grab's implementation of a Time-Aware Least Recently Used (TLRU) cache for their Android application is a compelling case study in optimizing mobile resource management. The core innovation lies in extending traditional LRU with time-based eviction, directly addressing the dual problem of cache bloat and premature eviction of still-relevant but less recently accessed data. This approach reclaims significant storage space across a vast user base without compromising the user experience or incurring additional server costs, a critical balance for large-scale applications.

The decision to fork and extend Glide's DiskLruCache is a pragmatic one, leveraging a robust, battle-tested foundation rather than reinventing the wheel, which is a common and often wise strategy in complex software development. The detailed explanation of the three key parameters, TTL, minimum cache size threshold, and maximum cache size, provides a clear blueprint for understanding TLRU's operational mechanics.

The challenges encountered, such as persisting last-access times and migrating existing caches, are realistically presented, highlighting the practical hurdles in implementing such a system. The success metric of a mere 3-percentage-point decrease in cache hit ratio during the transition is a strong indicator of the effectiveness of their tuning and the inherent value of the TLRU approach. This article is a valuable contribution to the mobile development community, offering actionable insights into improving a fundamental aspect of app performance.
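As a rough illustration of how the three parameters interact, here is a minimal in-memory sketch. This is not Grab's actual code, which extends Glide's DiskLruCache and persists last-access times to disk; the class and parameter names (`TlruCache`, `ttlMillis`, `minSizeBytes`, `maxSizeBytes`) are invented for illustration:

```java
import java.util.Iterator;
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative TLRU sketch: LRU ordering plus TTL-based eviction,
// gated by a minimum cache size threshold and a hard maximum cap.
class TlruCache<K, V> {
    private static final class Entry<V> {
        final V value;
        final long sizeBytes;
        long lastAccessMillis;   // the real implementation persists this to disk
        Entry(V value, long sizeBytes, long lastAccessMillis) {
            this.value = value;
            this.sizeBytes = sizeBytes;
            this.lastAccessMillis = lastAccessMillis;
        }
    }

    // accessOrder = true gives LRU iteration order (least recently used first).
    private final LinkedHashMap<K, Entry<V>> map = new LinkedHashMap<>(16, 0.75f, true);
    private final long ttlMillis;      // Time To Live before an idle entry expires
    private final long minSizeBytes;   // below this size, TTL eviction is skipped
    private final long maxSizeBytes;   // hard cap, enforced by plain LRU eviction
    private long currentSizeBytes = 0;

    TlruCache(long ttlMillis, long minSizeBytes, long maxSizeBytes) {
        this.ttlMillis = ttlMillis;
        this.minSizeBytes = minSizeBytes;
        this.maxSizeBytes = maxSizeBytes;
    }

    synchronized V get(K key, long nowMillis) {
        Entry<V> e = map.get(key);
        if (e == null) return null;
        e.lastAccessMillis = nowMillis;   // refresh the entry's TTL clock on access
        return e.value;
    }

    synchronized void put(K key, V value, long sizeBytes, long nowMillis) {
        Entry<V> old = map.put(key, new Entry<>(value, sizeBytes, nowMillis));
        if (old != null) currentSizeBytes -= old.sizeBytes;
        currentSizeBytes += sizeBytes;
        evict(nowMillis);
    }

    private void evict(long nowMillis) {
        // Pass 1 (time-aware): drop expired entries, but only while the cache
        // is larger than the minimum size threshold.
        Iterator<Map.Entry<K, Entry<V>>> it = map.entrySet().iterator();
        while (it.hasNext() && currentSizeBytes > minSizeBytes) {
            Entry<V> e = it.next().getValue();
            if (nowMillis - e.lastAccessMillis > ttlMillis) {
                currentSizeBytes -= e.sizeBytes;
                it.remove();
            }
        }
        // Pass 2 (classic LRU): enforce the hard maximum size cap.
        it = map.entrySet().iterator();
        while (currentSizeBytes > maxSizeBytes && it.hasNext()) {
            currentSizeBytes -= it.next().getValue().sizeBytes;
            it.remove();
        }
    }

    synchronized long sizeBytes() { return currentSizeBytes; }
}
```

The key departure from plain LRU is the first eviction pass: expired entries are reclaimed proactively to free storage, but only once the cache exceeds the minimum size threshold, so a small cache does not churn its contents unnecessarily.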

While the article effectively showcases the benefits and implementation of TLRU, a deeper dive into the 'controlled experiments' used to find optimal configuration values could further enhance its value. Understanding the specific metrics beyond cache hit ratio that were monitored (e.g., app startup time, image load times, memory usage) and the methodology of these experiments would offer more granular guidance for other teams. Additionally, exploring potential trade-offs, such as the increased complexity of the cache management logic and the slightly higher CPU overhead of time-based checks on each cache access, would provide a more complete picture.

The decision to assign the migration timestamp to all entries, while pragmatic for immediate data preservation, does mean that the full benefits of time-based eviction are delayed by one TTL period for all existing content. This is a noted trade-off, but understanding the typical TTL values used and the expected duration of this initial 'grace period' would be beneficial. Despite these minor points, the overall impact and ingenuity of Grab's TLRU implementation, especially its successful application in a high-traffic, resource-constrained mobile environment, make this a highly relevant and informative piece for any developer concerned with efficient caching strategies.

Key Points

  • Grab optimized its Android app's image caching by transitioning from LRU to Time-Aware LRU (TLRU).
  • TLRU addresses shortcomings of LRU, such as rapid cache filling and storage wasted on expired items that are never evicted.
  • Key TLRU parameters include Time To Live (TTL), minimum cache size threshold, and maximum cache size.
  • Grab forked and extended Glide's DiskLruCache for their TLRU implementation, leveraging its robust foundation.
  • Implementation involved tracking last-access time, time-based eviction logic, and a migration mechanism for existing caches.
  • The transition kept the cache hit ratio within acceptable limits (at most a 3-percentage-point decrease) while reclaiming significant storage (50 MB for 95% of users).
  • This optimization reduces server costs and improves user experience by managing storage more effectively.
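The migration mechanism mentioned in the points above can be sketched roughly as follows. The names (`CacheMigration`, `TimedEntry`, `migrate`) are hypothetical and assume only the behavior described in the article: every pre-existing entry is stamped with the migration timestamp so that nothing is evicted immediately:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Hypothetical sketch of the cache migration step: legacy LRU entries have no
// last-access time, so each one is stamped with the migration timestamp.
class CacheMigration {
    record TimedEntry(long sizeBytes, long lastAccessMillis) {}

    /** Stamp every legacy entry with the migration time so nothing is evicted
     *  on day one; real expiry begins one full TTL after migration. */
    static Map<String, TimedEntry> migrate(Map<String, Long> legacySizes, long migrationMillis) {
        Map<String, TimedEntry> out = new LinkedHashMap<>();
        legacySizes.forEach((key, size) ->
                out.put(key, new TimedEntry(size, migrationMillis)));
        return out;
    }

    /** An entry stamped at t0 only becomes eligible for TTL eviction after t0 + ttl. */
    static boolean isExpired(TimedEntry e, long nowMillis, long ttlMillis) {
        return nowMillis - e.lastAccessMillis > ttlMillis;
    }
}
```

Because every legacy entry starts its TTL clock at migration time, none can expire until one full TTL period has elapsed, which is exactly the 'grace period' trade-off discussed above.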


📖 Source: How Grab Optimizes Image Caching on Android with Time-Aware LRU
