A production-grade memory profiling service for iOS applications that provides real-time memory monitoring, leak detection, and memory usage analytics.
- Real-time monitoring using `mach_task_basic_info` (see the sketch after this list)
- Device-aware thresholds (70% of device RAM by default)
- Leak detection for ViewModels, Services, etc.
- Runtime control - enable or disable the service at runtime
- Environment configuration - run in debug-only or all builds; no `#if DEBUG` needed, conditional compilation is handled automatically
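The sketch below illustrates, under stated assumptions rather than as the SDK's actual implementation, how resident memory can be read with `mach_task_basic_info` and how a 70%-of-RAM threshold can be derived; both helper names are hypothetical.

```swift
import Foundation

/// Hypothetical helper: read the app's resident memory via mach_task_basic_info.
func residentMemoryBytes() -> UInt64? {
    var info = mach_task_basic_info()
    var count = mach_msg_type_number_t(MemoryLayout<mach_task_basic_info>.size / MemoryLayout<natural_t>.size)
    let kr = withUnsafeMutablePointer(to: &info) { ptr in
        ptr.withMemoryRebound(to: integer_t.self, capacity: Int(count)) {
            task_info(mach_task_self_, task_flavor_t(MACH_TASK_BASIC_INFO), $0, &count)
        }
    }
    return kr == KERN_SUCCESS ? info.resident_size : nil
}

/// Hypothetical helper: 70% of physical RAM, matching the default threshold described above.
func defaultWarningThresholdBytes() -> UInt64 {
    UInt64(Double(ProcessInfo.processInfo.physicalMemory) * 0.7)
}
```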
```swift
import UIKit

@main
class AppDelegate: UIResponder, UIApplicationDelegate {
    private var memoryProfilerService = MemoryProfilerService()

    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
        // Start monitoring as early as possible in the app lifecycle
        memoryProfilerService.startMonitoring()
        return true
    }
}
```

```swift
// Initialize with custom settings
let profiler = MemoryProfilerService(
    isEnabled: true,            // Enable the service (default: true)
    environment: .debugOnly     // Only run in debug builds (default: .debugOnly)
)
```
```swift
// Runtime control
profiler.disable()   // Turn off monitoring and clear timers
profiler.enable()    // Turn on monitoring

// Check status
if profiler.isServiceEnabled {
    profiler.logMemoryUsage(context: "Active monitoring")
}
print("Environment: \(profiler.environment)")
```

```swift
final class MyViewModel: ObservableLoggableObject {
    private var memoryProfilerService = MemoryProfilerService()

    func loadData() {
        memoryProfilerService.logMemoryUsage(context: "Before loading data")
        Task {
            do {
                let data = try await apiClient.request(endpoint)
                memoryProfilerService.logMemoryUsage(context: "After loading data")
            } catch {
                // Handle error
            }
        }
    }
}
```

```swift
func processLargeImages() {
    memoryProfilerService.logMemoryUsage(context: "Before image processing")

    // Your heavy image processing
    for image in largeImageArray {
        processImage(image)
    }

    memoryProfilerService.logMemoryUsage(context: "After image processing")
}
```

```swift
// Set custom threshold (default is 70% of device RAM)
memoryProfilerService.setMemoryWarningThreshold(1024 * 1024 * 1024) // 1GB
```

```swift
let stats = memoryProfilerService.getMemoryStats()
print("Used: \(stats.usedMemory / 1024 / 1024)MB")
print("Available: \(stats.availableMemory / 1024 / 1024)MB")
print("Total: \(stats.totalMemory / 1024 / 1024)MB")
print("Usage: \(String(format: "%.2f", stats.memoryUsagePercentage))%")
```

- Monitor memory usage during feature development
- Check for leaks after adding new ViewModels or Services
- Verify memory cleanup in `deinit` methods
- Before/after heavy operations (image processing, file uploads)
- During network requests with large payloads
- When implementing caching mechanisms
- Before/after refactoring to measure improvements (see the delta sketch after this list)
- When implementing lazy loading
- After adding new dependencies
- When app crashes with memory warnings
- When performance feels sluggish
- When investigating memory-related bugs
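For the before/after comparisons above, one option is a small wrapper around the documented `getMemoryStats()` call that logs the delta across an operation; `measureMemoryDelta` and `rebuildThumbnailCache` are hypothetical, not part of the SDK.

```swift
// Hypothetical helper built on the documented getMemoryStats() API.
func measureMemoryDelta(label: String, _ work: () -> Void) {
    let before = memoryProfilerService.getMemoryStats().usedMemory
    work()
    let after = memoryProfilerService.getMemoryStats().usedMemory
    let deltaMB = (Int64(after) - Int64(before)) / 1024 / 1024
    print("\(label): memory delta \(deltaMB)MB")
}

// Wrap the operation you are refactoring or optimizing.
measureMemoryDelta(label: "Thumbnail cache rebuild") {
    rebuildThumbnailCache() // your own code
}
```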
```
🧠 Memory Profiler: Started monitoring
🧠 Memory warning threshold set to 5376MB (70% of 7680MB total)
🧠 Memory usage: 45.23% (used: 3456MB, available: 4224MB, total: 7680MB)
⚠️ WARNING: Memory usage exceeded threshold: 5500MB > 5376MB
⚠️ Received system memory warning!
🧠 Periodic memory check: 3200MB used
🧠 Memory usage: 41.67% (used: 3200MB, available: 4480MB, total: 7680MB)
```
The service can be used directly without dependency injection:
```swift
// Direct instantiation
private var memoryProfilerService = MemoryProfilerService()
```
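If you do use dependency injection, one plausible wiring is to pass a shared profiler through an initializer; `FeedViewModel` below is a hypothetical consumer, not part of the SDK.

```swift
final class FeedViewModel {
    private let profiler: MemoryProfilerService

    // Hypothetical constructor injection: share one profiler instance app-wide.
    init(profiler: MemoryProfilerService = MemoryProfilerService()) {
        self.profiler = profiler
    }

    func reload() {
        profiler.logMemoryUsage(context: "Before feed reload")
        // ... load and decode the feed ...
        profiler.logMemoryUsage(context: "After feed reload")
    }
}
```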
```swift
// ✅ Good - Log before/after heavy operations
func uploadImages() {
    memoryProfilerService.logMemoryUsage(context: "Before upload")
    // Upload logic
    memoryProfilerService.logMemoryUsage(context: "After upload")
}

// ❌ Avoid - Logging too frequently
func everyMethod() {
    memoryProfilerService.logMemoryUsage() // Too much noise
}
```

```swift
func loadContacts() {
    memoryProfilerService.logMemoryUsage(context: "Before loading contacts")
    Task {
        let contacts = try await apiClient.request(APIEndpoint.fetchContacts)
        await MainActor.run {
            self.contacts = contacts
            memoryProfilerService.logMemoryUsage(context: "After loading contacts")
        }
    }
}
```

- Check if ViewModels are being deallocated properly
- Look for retain cycles in closures
- Verify `cancellables.removeAll()` in `deinit`
- Check for large image caches
- Ensure `deinit` methods are called
- Check for strong reference cycles
- Verify task cancellation in ViewModels (see the cleanup sketch after this list)
- Look for unclosed network connections
- Monitor memory during heavy operations
- Check for memory spikes during image processing
- Verify memory cleanup after operations
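For the ViewModel checks above, the usual pattern is weak captures in closures, task cancellation, and clearing Combine subscriptions in `deinit`. A minimal sketch, with illustrative class and property names:

```swift
import Combine

final class ContactsViewModel {
    private var cancellables = Set<AnyCancellable>()
    private var loadTask: Task<Void, Never>?

    func observe(_ contactsPublisher: AnyPublisher<[String], Never>) {
        contactsPublisher
            .sink { [weak self] contacts in   // weak capture avoids a retain cycle
                self?.handle(contacts)
            }
            .store(in: &cancellables)
    }

    func startLoading() {
        loadTask = Task { [weak self] in
            let contacts = await Self.fetchContacts() // stand-in for your network call
            self?.handle(contacts)
        }
    }

    private static func fetchContacts() async -> [String] { [] }

    private func handle(_ contacts: [String]) {
        // update state
    }

    deinit {
        loadTask?.cancel()        // verify task cancellation
        cancellables.removeAll()  // verify cancellables.removeAll() in deinit
    }
}
```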
```swift
let stats = memoryProfilerService.getMemoryStats()

if stats.memoryUsagePercentage > 80 {
    logWarning("High memory usage detected: \(stats.memoryUsagePercentage)%")
}

if Double(stats.usedMemory) > Double(stats.peakMemoryUsage) * 0.9 {
    logWarning("Approaching peak memory usage")
}
```

```swift
let leaks = memoryProfilerService.detectMemoryLeaks()
for leak in leaks {
    logError("Potential leak: \(leak.objectType) - \(leak.objectCount) objects")
}
```

The Memory Profiler SDK provides enterprise-grade memory monitoring with:
- ✅ Zero production impact (debug-only by default)
- ✅ Runtime configuration (enable/disable at runtime)
- ✅ No `#if DEBUG` needed (automatic conditional compilation)
- ✅ Real system APIs (accurate memory data)
- ✅ Device-aware thresholds (70% of RAM)
- ✅ Easy integration (DI-ready)
Use it to catch memory issues before they crash your app and to keep performance at its best! 🚀