Preface
I recently joined a new project team that is still on the fairly old Lottie 2.5.3 (the last Objective-C release; later versions moved to Swift, though the problem described here has little to do with whether you are on the OC or the Swift side). During a routine development cycle I noticed that on a home page that automatically cycles through audio cards, memory usage kept growing even with the phone left untouched, until the app eventually ran out of memory (OOM). The root cause turned out to be in the Lottie source code itself; the analysis follows.
Code Analysis
After a rather long round of analysis, the culprit turned out to be the image decoding code in Lottie's LOTLayerContainer class. Here is Lottie's image decoding code before any modification:
UIImage *image;
if ([asset.imageName hasPrefix:@"data:"]) {
    // Contents look like a data: URL. Ignore asset.imageDirectory and simply load the image directly.
    NSURL *imageUrl = [NSURL URLWithString:asset.imageName];
    NSData *imageData = [NSData dataWithContentsOfURL:imageUrl];
    image = [UIImage imageWithData:imageData];
} else if (asset.rootDirectory.length > 0) {
    NSString *rootDirectory = asset.rootDirectory;
    if (asset.imageDirectory.length > 0) {
        rootDirectory = [rootDirectory stringByAppendingPathComponent:asset.imageDirectory];
    }
    NSString *imagePath = [rootDirectory stringByAppendingPathComponent:asset.imageName];
    id<LOTImageCache> imageCache = [LOTCacheProvider imageCache];
    if (imageCache) {
        image = [imageCache imageForKey:imagePath];
        if (!image) {
            image = [UIImage imageWithContentsOfFile:imagePath];
            [imageCache setImage:image forKey:imagePath];
        }
    } else {
        image = [UIImage imageWithContentsOfFile:imagePath];
    }
} else {
    NSString *imagePath = [asset.assetBundle pathForResource:asset.imageName ofType:nil];
    image = [UIImage imageWithContentsOfFile:imagePath];
}
There are two main problems here:
1. Lottie does declare the LOTImageCache protocol for caching images, but there is no default implementation, so out of the box nothing is cached; and for a normal user of the library this cache hook is easy to miss entirely, let alone implement. (The Swift versions of Lottie accept a custom image provider right in the initializer, presumably to address exactly this.) After implementing the image cache protocol, memory usage dropped by about 30%. The protocol, reconstructed from the call sites above, is sketched right after this list.
2. Decoding goes straight through the system API [UIImage imageWithContentsOfFile:imagePath]. Frankly, this API is not very efficient for this use case: first, the image is only decoded when it is actually drawn on screen (the bitmap is not decoded ahead of time); second, it decodes the image in full, so a large image file means a very large memory spike. This is also why Lottie often grabs well over 100 MB the instant an animation starts playing and only releases it when playback ends.
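For reference, the cache hook from problem 1 looks roughly like the sketch below. It is reconstructed from the call sites in the snippet above (imageForKey:, setImage:forKey:, [LOTCacheProvider imageCache] / setImageCache:) rather than copied verbatim from Lottie's LOTCacheProvider.h, so treat the exact declarations as approximate:

// Approximate shape of the caching hook in Lottie 2.5.3 (LOTCacheProvider.h),
// reconstructed from how LOTLayerContainer uses it above.
typedef UIImage LOTImage; // on macOS this would be NSImage instead

@protocol LOTImageCache <NSObject>
// Return a previously cached decoded image for the given key (the image path).
- (LOTImage *)imageForKey:(NSString *)key;
// Store a decoded image under the given key.
- (void)setImage:(LOTImage *)image forKey:(NSString *)key;
@end

@interface LOTCacheProvider : NSObject
// The globally registered cache; nil unless the host app provides one.
+ (id<LOTImageCache>)imageCache;
+ (void)setImageCache:(id<LOTImageCache>)cache;
@end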
Solution
Now that the cause is clear, the fix is straightforward.
First, implement the LOTImageCache protocol and register an instance in +load. Create a new XXXLOTImageCache class that adopts the protocol:
@interface XXXLOTImageCache : NSObject
@end

@interface XXXLOTImageCache () <LOTImageCache>
@property (nonatomic, strong) NSCache *imageCache;
@end

@implementation XXXLOTImageCache

+ (void)load {
    // Register the cache once at startup so Lottie picks it up automatically.
    [LOTCacheProvider setImageCache:[[XXXLOTImageCache alloc] init]];
}

- (instancetype)init {
    self = [super init];
    if (self) {
        _imageCache = [[NSCache alloc] init];
        // [OPSDevice memoryTotal] is an in-house helper returning total device memory in MB;
        // convert it to whole GB and cap the scaling factor at 6 GB.
        CGFloat memoryGBUnit = ceil([OPSDevice memoryTotal] / 1024.0);
        if (memoryGBUnit > 6) {
            memoryGBUnit = 6;
        }
        // 50 images per GB of device memory
        _imageCache.countLimit = (NSUInteger)(50 * memoryGBUnit);
        // 50 MB per GB of device memory
        _imageCache.totalCostLimit = (NSUInteger)(1024 * 1024 * (50 * memoryGBUnit));
        [[NSNotificationCenter defaultCenter] addObserver:self
                                                 selector:@selector(didReceiveMemoryWarning:)
                                                     name:UIApplicationDidReceiveMemoryWarningNotification
                                                   object:nil];
    }
    return self;
}

- (void)didReceiveMemoryWarning:(NSNotification *)notification {
    // Drop everything under memory pressure; Lottie will re-decode on demand.
    [self.imageCache removeAllObjects];
}

- (LOTImage *)imageForKey:(NSString *)key {
    return [self.imageCache objectForKey:key];
}

- (void)setImage:(LOTImage *)image forKey:(NSString *)key {
    // NSCache throws on nil objects, and decoding can fail, so guard here.
    if (!image || !key) {
        return;
    }
    [self.imageCache setObject:image forKey:key];
}

@end
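One caveat about the cache above: NSCache only enforces totalCostLimit against costs that are supplied explicitly, and setObject:forKey: records a cost of 0, so in practice only countLimit does any limiting. If you want the 50 MB-per-GB budget enforced as well, a variant of setImage:forKey: along these lines could be used (the bitmap-size cost estimate is my own addition, not part of the original change):

- (void)setImage:(LOTImage *)image forKey:(NSString *)key {
    if (!image || !key) {
        return;
    }
    // Rough cost: size of the decoded bitmap in bytes (bytesPerRow * height),
    // 0 if the image has no CGImage backing.
    NSUInteger cost = 0;
    CGImageRef cgImage = image.CGImage;
    if (cgImage) {
        cost = CGImageGetBytesPerRow(cgImage) * CGImageGetHeight(cgImage);
    }
    [self.imageCache setObject:image forKey:key cost:cost];
}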
The second step is to replace the system decoding call with our own implementation. It uses an ImageIO incremental image source and decodes the bitmap eagerly, largely following SDWebImage's decoding approach. The effect is noticeable: memory dropped by another 20%+. The code below can be dropped in as a direct replacement for the UIImage system call.
CGColorSpaceRef GlobalCGColorSpaceGetDeviceRGB(void) {
    static CGColorSpaceRef colorSpace;
    static dispatch_once_t onceToken;
    dispatch_once(&onceToken, ^{
        colorSpace = CGColorSpaceCreateDeviceRGB();
    });
    return colorSpace;
}

BOOL GlobalCGImageRefContainsAlpha(CGImageRef imageRef) {
    if (!imageRef) {
        return NO;
    }
    CGImageAlphaInfo alphaInfo = CGImageGetAlphaInfo(imageRef);
    BOOL hasAlpha = !(alphaInfo == kCGImageAlphaNone ||
                      alphaInfo == kCGImageAlphaNoneSkipFirst ||
                      alphaInfo == kCGImageAlphaNoneSkipLast);
    return hasAlpha;
}

+ (UIImage *)decodeImageWithPath:(NSString *)imagePath {
    if (![[NSFileManager defaultManager] fileExistsAtPath:imagePath]) {
        return nil;
    }
    @autoreleasepool {
        NSData *imageData = [NSData dataWithContentsOfFile:imagePath];
        CGImageSourceRef imageSource = CGImageSourceCreateIncremental(NULL);
        CGImageSourceUpdateData(imageSource, (__bridge CFDataRef)imageData, YES);

        // Read the pixel size from the image properties before creating the image.
        CFDictionaryRef properties = CGImageSourceCopyPropertiesAtIndex(imageSource, 0, NULL);
        size_t _width = 0;
        size_t _height = 0;
        if (properties) {
            CFTypeRef val = CFDictionaryGetValue(properties, kCGImagePropertyPixelHeight);
            if (val) CFNumberGetValue((CFNumberRef)val, kCFNumberLongType, &_height);
            val = CFDictionaryGetValue(properties, kCGImagePropertyPixelWidth);
            if (val) CFNumberGetValue((CFNumberRef)val, kCFNumberLongType, &_width);
            CFRelease(properties);
        }

        CGImageRef decompressedImageRef = CGImageSourceCreateImageAtIndex(imageSource, 0, NULL);
        if ((_width + _height) == 0 || decompressedImageRef == NULL) {
            // Invalid image data: release the ImageIO objects before bailing out.
            if (decompressedImageRef) CGImageRelease(decompressedImageRef);
            CFRelease(imageSource);
            return nil;
        }
        UIImage *compressedImage = [UIImage imageWithCGImage:decompressedImageRef];
        CGImageRelease(decompressedImageRef);
        CFRelease(imageSource);
        imageSource = NULL;

        // Force-decode the bitmap up front instead of leaving it to first render.
        compressedImage = [LOTLayerContainer decompressedImageWithImage:compressedImage];
        return compressedImage;
    }
}
+ (nullable UIImage *)decompressedImageWithImage:(nullable UIImage *)image {
    if (!image) {
        return image;
    }
    CGImageRef imageRef = image.CGImage;
    // device color space
    CGColorSpaceRef colorspaceRef = GlobalCGColorSpaceGetDeviceRGB();
    BOOL hasAlpha = GlobalCGImageRefContainsAlpha(imageRef);
    // iOS display-friendly pixel format (BGRA8888 / BGRX8888)
    CGBitmapInfo bitmapInfo = kCGBitmapByteOrder32Host;
    bitmapInfo |= hasAlpha ? kCGImageAlphaPremultipliedFirst : kCGImageAlphaNoneSkipFirst;
    size_t width = CGImageGetWidth(imageRef);
    size_t height = CGImageGetHeight(imageRef);
    CGContextRef context = CGBitmapContextCreate(NULL,
                                                 width,
                                                 height,
                                                 8,
                                                 0,
                                                 colorspaceRef,
                                                 bitmapInfo);
    if (context == NULL) {
        return image;
    }
    // Draw the image into the context and grab the fully decoded bitmap back out.
    CGContextDrawImage(context, CGRectMake(0, 0, width, height), imageRef);
    CGImageRef decodedImageRef = CGBitmapContextCreateImage(context);
    UIImage *decodedImage = [[UIImage alloc] initWithCGImage:decodedImageRef scale:image.scale orientation:image.imageOrientation];
    CGContextRelease(context);
    CGImageRelease(decodedImageRef);
    return decodedImage;
}
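To actually put the new decoder to work, the [UIImage imageWithContentsOfFile:] calls in the LOTLayerContainer snippet at the top are replaced with the new class method. The original post does not show the modified call sites, so the following is only a sketch of what that swap might look like:

// Inside LOTLayerContainer's asset-loading code, after the change (sketch):
id<LOTImageCache> imageCache = [LOTCacheProvider imageCache];
if (imageCache) {
    image = [imageCache imageForKey:imagePath];
    if (!image) {
        // Decode eagerly via ImageIO instead of [UIImage imageWithContentsOfFile:].
        image = [LOTLayerContainer decodeImageWithPath:imagePath];
        [imageCache setImage:image forKey:imagePath];
    }
} else {
    image = [LOTLayerContainer decodeImageWithPath:imagePath];
}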
Results
Comparing before and after, memory usage dropped by more than 50% in both the app-launch and the enter-room scenarios, a very noticeable improvement.
Conclusion
That is the whole process of this Lottie optimization. If you are also on the Objective-C version of Lottie, go for it: pull the source into your project and apply the small changes described above to get the same result. If you are on a Swift version, the same idea still applies: implement a custom ImageProvider and wrap it behind a shared Lottie helper.