Common causes of UITableView jank include an overly deep view hierarchy inside cells, cell code that triggers offscreen rendering (for example, setting cornerRadius together with masksToBounds), misaligned pixels, and relying on UITableView's automatic cell-height calculation. This article focuses on the cell hierarchy, walking through a Moments-style feed demo to show how to keep a list scrolling smoothly; the project source is linked at the end. That said, premature optimization is the root of all evil: wait until a project actually hits a performance bottleneck before optimizing.
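As an aside on the offscreen-rendering point: the usual fix is to pre-render rounded corners with Core Graphics once, instead of combining cornerRadius with masksToBounds on a live layer. A minimal sketch (not code from the demo):

- (UIImage *)roundedImageWithImage:(UIImage *)image cornerRadius:(CGFloat)radius {
    CGRect rect = (CGRect){CGPointZero, image.size};
    // Draw the image through a rounded-rect clipping path into an offscreen bitmap once,
    // so the layer can display the result without masking every frame
    UIGraphicsBeginImageContextWithOptions(image.size, NO, image.scale);
    [[UIBezierPath bezierPathWithRoundedRect:rect cornerRadius:radius] addClip];
    [image drawInRect:rect];
    UIImage *rounded = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return rounded;
}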
First, a look at the page's layer hierarchy in Reveal:
And the result on a real iPhone 5 running iOS 9.3:
1. Drawing text
With Core Text, text can be drawn into a CGContextRef; UIGraphicsGetImageFromCurrentImageContext() then produces an image from that context, and the image is assigned to cell.contentView.layer.contents, which reduces the cell's layer hierarchy to almost nothing.
Drawing plain text (for example, the user's nickname) into the context; the relevant comments are inline:
- (void)drawInContext:(CGContextRef)context
         withPosition:(CGPoint)p
              andFont:(UIFont *)font
         andTextColor:(UIColor *)color
            andHeight:(float)height
             andWidth:(float)width
        lineBreakMode:(CTLineBreakMode)lineBreakMode {
    CGSize size = CGSizeMake(width, height);
    // Flip the coordinate system (Core Text draws with the origin at the bottom left)
    CGContextSetTextMatrix(context, CGAffineTransformIdentity);
    CGContextTranslateCTM(context, 0, height);
    CGContextScaleCTM(context, 1.0, -1.0);
    NSMutableDictionary *attributes = [StringAttributes attributeFont:font andTextColor:color lineBreakMode:lineBreakMode];
    // Create the drawing area (path)
    CGMutablePathRef path = CGPathCreateMutable();
    CGPathAddRect(path, NULL, CGRectMake(p.x, height - p.y - size.height, size.width, size.height));
    // Create the attributed string
    NSMutableAttributedString *attributedStr = [[NSMutableAttributedString alloc] initWithString:self attributes:attributes];
    CFAttributedStringRef attributedString = (__bridge CFAttributedStringRef)attributedStr;
    // Draw the frame
    CTFramesetterRef framesetter = CTFramesetterCreateWithAttributedString(attributedString);
    CTFrameRef ctframe = CTFramesetterCreateFrame(framesetter, CFRangeMake(0, 0), path, NULL);
    CTFrameDraw(ctframe, context);
    CGPathRelease(path);
    CFRelease(framesetter);
    CFRelease(ctframe);
    [[attributedStr mutableString] setString:@""];
    // Flip the coordinate system back
    CGContextSetTextMatrix(context, CGAffineTransformIdentity);
    CGContextTranslateCTM(context, 0, height);
    CGContextScaleCTM(context, 1.0, -1.0);
}
Drawing the post body text (with links) into the context. Collapsing of overly long text is not implemented yet. Unlike the plain text above, this needs an attributed string carrying the link ranges, plus line-by-line drawing with CTLineRef:
- (NSMutableAttributedString *)highlightText:(NSMutableAttributedString *)coloredString {
    // Build the attributed string with the link ranges highlighted
    NSString *string = coloredString.string;
    NSRange range = NSMakeRange(0, [string length]);
    NSDataDetector *linkDetector = [NSDataDetector dataDetectorWithTypes:NSTextCheckingTypeLink error:nil];
    NSArray *matches = [linkDetector matchesInString:string options:0 range:range];
    for (NSTextCheckingResult *match in matches) {
        [self.ranges addObject:NSStringFromRange(match.range)];
        UIColor *highlightColor = UIColorFromRGB(0x297bc1);
        [coloredString addAttribute:(NSString *)kCTForegroundColorAttributeName value:(id)highlightColor.CGColor range:match.range];
    }
    return coloredString;
}

- (void)drawFramesetter:(CTFramesetterRef)framesetter
       attributedString:(NSAttributedString *)attributedString
              textRange:(CFRange)textRange
                 inRect:(CGRect)rect
                context:(CGContextRef)c {
    CGMutablePathRef path = CGPathCreateMutable();
    CGPathAddRect(path, NULL, rect);
    CTFrameRef frame = CTFramesetterCreateFrame(framesetter, textRange, path, NULL);
    CGFloat contentHeight = CGRectGetHeight(rect);
    CFArrayRef lines = CTFrameGetLines(frame);
    NSInteger numberOfLines = CFArrayGetCount(lines);
    CGPoint lineOrigins[numberOfLines];
    CTFrameGetLineOrigins(frame, CFRangeMake(0, numberOfLines), lineOrigins);
    // Iterate over the lines
    for (CFIndex lineIndex = 0; lineIndex < numberOfLines; lineIndex++) {
        CGPoint lineOrigin = lineOrigins[lineIndex];
        CTLineRef line = CFArrayGetValueAtIndex(lines, lineIndex);
        CGFloat descent = 0.0f, ascent = 0.0f, lineLeading = 0.0f;
        CTLineGetTypographicBounds((CTLineRef)line, &ascent, &descent, &lineLeading);
        CGFloat penOffset = (CGFloat)CTLineGetPenOffsetForFlush(line, NSTextAlignmentLeft, rect.size.width);
        CGFloat y = lineOrigin.y - descent - self.font.descender;
        // Set each line's position and draw it
        CGContextSetTextPosition(c, penOffset + self.xOffset, y - self.yOffset);
        CTLineDraw(line, c);
        // A CTRunRef is a span within one line sharing the same attributes (color, font, ...);
        // here the runs are used to find and handle highlighted links
        CFArrayRef runs = CTLineGetGlyphRuns(line);
        for (int j = 0; j < CFArrayGetCount(runs); j++) {
            CGFloat runAscent, runDescent, runLeading;
            CTRunRef run = CFArrayGetValueAtIndex(runs, j);
            NSDictionary *attributes = (__bridge NSDictionary *)CTRunGetAttributes(run);
            // Check whether this run is a link (its foreground color differs from the body text color)
            if (!CGColorEqualToColor((__bridge CGColorRef)([attributes valueForKey:@"CTForegroundColor"]), self.textColor.CGColor)) {
                CFRange range = CTRunGetStringRange(run);
                float offset = CTLineGetOffsetForStringIndex(line, range.location, NULL);
                // Compute the link's CGRect
                CGRect runRect;
                runRect.size.width = CTRunGetTypographicBounds(run, CFRangeMake(0, 0), &runAscent, &runDescent, &runLeading);
                runRect.size.height = self.font.lineHeight;
                runRect.origin.x = lineOrigin.x + offset + self.xOffset;
                runRect.origin.y = lineOrigin.y - descent - self.yOffset;
                // Because the coordinate system is flipped, the link's on-screen rect
                // has to be derived with a CGAffineTransform
                CGAffineTransform transform = CGAffineTransformMakeTranslation(0, contentHeight);
                transform = CGAffineTransformScale(transform, 1.f, -1.f);
                CGRect flipRect = CGRectApplyAffineTransform(runRect, transform);
                // Store the link's rect
                NSRange nRange = NSMakeRange(range.location, range.length);
                self.framesDict[NSStringFromRange(nRange)] = [NSValue valueWithCGRect:flipRect];
                // Store all rects belonging to the same link, used for the tap-highlight background
                for (NSString *rangeString in self.ranges) {
                    NSRange range = NSRangeFromString(rangeString);
                    if (NSLocationInRange(nRange.location, range)) {
                        NSMutableArray *array = self.relationDict[rangeString];
                        if (array) {
                            [array addObject:NSStringFromCGRect(flipRect)];
                            self.relationDict[rangeString] = array;
                        } else {
                            self.relationDict[rangeString] = [NSMutableArray arrayWithObject:NSStringFromCGRect(flipRect)];
                        }
                    }
                }
            }
        }
    }
    CFRelease(frame);
    CFRelease(path);
}
Putting the methods above to use:
- (void)fillData:(CGContextRef)context {
    [self.nickname drawInContext:context
                    withPosition:(CGPoint){kTextXOffset, kSpec}
                         andFont:kNicknameFont
                    andTextColor:UIColorFromRGB(0x556c95)
                       andHeight:self.nicknameSize.height
                        andWidth:self.nicknameSize.width
                   lineBreakMode:kCTLineBreakByTruncatingTail];
    [self.drawer setText:self.contentString
                 context:context
             contentSize:self.contentSize
         backgroundColor:[UIColor whiteColor]
                    font:kContentTextFont
               textColor:[UIColor blackColor]
                   block:nil
                 xOffset:kTextXOffset
                 yOffset:kSpec * 2 + self.nicknameSize.height];
}

- (void)fillContents:(NSArray *)array {
    UIGraphicsBeginImageContextWithOptions(CGSizeMake(self.size.width, self.size.height), YES, 0);
    CGContextRef context = UIGraphicsGetCurrentContext();
    [UIColorFromRGB(0xffffff) set];
    CGContextFillRect(context, CGRectMake(0, 0, self.size.width, self.size.height));
    // Fill the background of any link rects that need highlighting
    if (array) {
        for (NSString *string in array) {
            CGRect rect = CGRectFromString(string);
            [UIColorFromRGB(0xe5e5e5) set];
            CGContextFillRect(context, rect);
        }
    }
    [self fillData:context];
    UIImage *temp = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    self.contentView.layer.contents = (__bridge id _Nullable)(temp.CGImage);
}
That completes the text rendering.
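The demo draws into the image context on the calling thread. If profiling showed the drawing itself to be a bottleneck, the same rendering could move to a background queue, since UIGraphicsBeginImageContextWithOptions has been safe to call off the main thread since iOS 4. A hedged sketch (the link-highlight pass is omitted for brevity; fillData: is the method shown above, the async method name is mine):

- (void)fillContentsAsync {
    CGSize size = self.size;
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        // Render the whole cell content into an offscreen bitmap on a background queue
        UIGraphicsBeginImageContextWithOptions(size, YES, 0);
        CGContextRef context = UIGraphicsGetCurrentContext();
        [UIColorFromRGB(0xffffff) set];
        CGContextFillRect(context, (CGRect){CGPointZero, size});
        [self fillData:context];
        UIImage *temp = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        // Only the final contents assignment needs the main thread
        dispatch_async(dispatch_get_main_queue(), ^{
            self.contentView.layer.contents = (__bridge id _Nullable)(temp.CGImage);
        });
    });
}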
2. Displaying images
Images here means the user avatar and the post's photos. They are plain CALayers added to contentView.layer; concretely, a CALayer subclass implements the needed behavior.
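Roughly, the subclass's interface looks like the following sketch (the class name is mine; only originImage and the download method appear in the code in this article):

@interface FeedImageLayer : CALayer
// Keeps the full-resolution image around (e.g., for a tap-to-preview browser)
@property (nonatomic, strong) UIImage *originImage;
// Downloads the image and sets it as the layer's contents
- (void)setContentsWithURLString:(NSString *)urlString;
@end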
Loading an image from a URL:
- (void)setContentsWithURLString:(NSString *)urlString {
    self.contents = (__bridge id _Nullable)([UIImage imageNamed:@"placeholder"].CGImage);
    @weakify(self)
    SDWebImageManager *manager = [SDWebImageManager sharedManager];
    [manager downloadImageWithURL:[NSURL URLWithString:urlString]
                          options:SDWebImageCacheMemoryOnly
                         progress:nil
                        completed:^(UIImage *image, NSError *error, SDImageCacheType cacheType, BOOL finished, NSURL *imageURL) {
        if (image) {
            @strongify(self)
            dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
                if (!_observer) {
                    // Defer the contents assignment until the main run loop is about to
                    // sleep or exit, so it does not compete with scrolling
                    _observer = CFRunLoopObserverCreateWithHandler(kCFAllocatorDefault, kCFRunLoopBeforeWaiting | kCFRunLoopExit, false, POPAnimationApplyRunLoopOrder, ^(CFRunLoopObserverRef observer, CFRunLoopActivity activity) {
                        self.contents = (__bridge id _Nullable)(image.CGImage);
                    });
                    if (_observer) {
                        CFRunLoopAddObserver(CFRunLoopGetMain(), _observer, kCFRunLoopCommonModes);
                    }
                }
            });
            self.originImage = image;
        }
    }];
}
The rest is straightforward, so I won't expand on it here.
3. Displaying short videos
An earlier article covered how to build a simple video player, and it comes in handy here. The CALayer that displays the video's cover image is reused to display the video frames themselves.
An NSOperationQueue keeps playback smooth; the relevant code of VideoDecodeOperation, the NSOperation subclass, is below:
- (void)main {
    @autoreleasepool {
        if (self.isCancelled) {
            _newVideoFrameBlock = nil;
            _decodeFinishedBlock = nil;
            return;
        }
        AVURLAsset *asset = [AVURLAsset URLAssetWithURL:[[NSURL alloc] initFileURLWithPath:self.filePath] options:nil];
        NSError *error;
        AVAssetReader *reader = [[AVAssetReader alloc] initWithAsset:asset error:&error];
        if (error) {
            return;
        }
        NSArray *videoTracks = [asset tracksWithMediaType:AVMediaTypeVideo];
        AVAssetTrack *videoTrack = [videoTracks objectAtIndex:0];
        // For playback, m_pixelFormatType = kCVPixelFormatType_32BGRA;
        // for other uses such as video compression, m_pixelFormatType = kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
        int m_pixelFormatType = kCVPixelFormatType_32BGRA;
        NSDictionary *options = [NSDictionary dictionaryWithObject:[NSNumber numberWithInt:(int)m_pixelFormatType] forKey:(id)kCVPixelBufferPixelFormatTypeKey];
        AVAssetReaderTrackOutput *videoReaderOutput = [[AVAssetReaderTrackOutput alloc] initWithTrack:videoTrack outputSettings:options];
        [reader addOutput:videoReaderOutput];
        [reader startReading];
        if (self.isCancelled) {
            _newVideoFrameBlock = nil;
            _decodeFinishedBlock = nil;
            return;
        }
        // Make sure nominalFrameRate > 0; zero-frame-rate videos shot on Android have shown up before
        while ([reader status] == AVAssetReaderStatusReading && videoTrack.nominalFrameRate > 0) {
            if (self.isCancelled) {
                _newVideoFrameBlock = nil;
                _decodeFinishedBlock = nil;
                return;
            }
            CMSampleBufferRef sampleBuffer = [videoReaderOutput copyNextSampleBuffer];
            CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
            // Lock the base address of the pixel buffer
            CVPixelBufferLockBaseAddress(imageBuffer, 0);
            // Get the number of bytes per row for the pixel buffer
            size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);
            // Get the pixel buffer width and height
            size_t width = CVPixelBufferGetWidth(imageBuffer);
            size_t height = CVPixelBufferGetHeight(imageBuffer);
            // Generate an image to edit
            unsigned char *pixel = (unsigned char *)CVPixelBufferGetBaseAddress(imageBuffer);
            CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
            CGContextRef context = CGBitmapContextCreate(pixel, width, height, 8, bytesPerRow, colorSpace, kCGBitmapByteOrder32Little | kCGImageAlphaPremultipliedFirst);
            if (context != NULL) {
                CGImageRef imageRef = CGBitmapContextCreateImage(context);
                CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
                CGColorSpaceRelease(colorSpace);
                CGContextRelease(context);
                // Decode (inflate) the frame image
                size_t width = CGImageGetWidth(imageRef);
                size_t height = CGImageGetHeight(imageRef);
                size_t bitsPerComponent = CGImageGetBitsPerComponent(imageRef);
                // CGImageGetBytesPerRow() calculates incorrectly in iOS 5.0, so defer to CGBitmapContextCreate
                size_t bytesPerRow = 0;
                CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
                CGColorSpaceModel colorSpaceModel = CGColorSpaceGetModel(colorSpace);
                CGBitmapInfo bitmapInfo = CGImageGetBitmapInfo(imageRef);
                if (colorSpaceModel == kCGColorSpaceModelRGB) {
                    uint32_t alpha = (bitmapInfo & kCGBitmapAlphaInfoMask);
#pragma clang diagnostic push
#pragma clang diagnostic ignored "-Wassign-enum"
                    if (alpha == kCGImageAlphaNone) {
                        bitmapInfo &= ~kCGBitmapAlphaInfoMask;
                        bitmapInfo |= kCGImageAlphaNoneSkipFirst;
                    } else if (!(alpha == kCGImageAlphaNoneSkipFirst || alpha == kCGImageAlphaNoneSkipLast)) {
                        bitmapInfo &= ~kCGBitmapAlphaInfoMask;
                        bitmapInfo |= kCGImageAlphaPremultipliedFirst;
                    }
#pragma clang diagnostic pop
                }
                CGContextRef context = CGBitmapContextCreate(NULL, width, height, bitsPerComponent, bytesPerRow, colorSpace, bitmapInfo);
                CGColorSpaceRelease(colorSpace);
                if (!context) {
                    // Decoding failed; hand back the undecoded frame
                    if (self.newVideoFrameBlock) {
                        dispatch_async(dispatch_get_main_queue(), ^{
                            if (self.isCancelled) {
                                _newVideoFrameBlock = nil;
                                _decodeFinishedBlock = nil;
                                return;
                            }
                            self.newVideoFrameBlock(imageRef, self.filePath);
                            CGImageRelease(imageRef);
                        });
                    }
                } else {
                    CGContextDrawImage(context, CGRectMake(0.0f, 0.0f, width, height), imageRef);
                    CGImageRef inflatedImageRef = CGBitmapContextCreateImage(context);
                    CGContextRelease(context);
                    if (self.newVideoFrameBlock) {
                        dispatch_async(dispatch_get_main_queue(), ^{
                            if (self.isCancelled) {
                                _newVideoFrameBlock = nil;
                                _decodeFinishedBlock = nil;
                                return;
                            }
                            self.newVideoFrameBlock(inflatedImageRef, self.filePath);
                            CGImageRelease(inflatedImageRef);
                        });
                    }
                    CGImageRelease(imageRef);
                }
                if (sampleBuffer) {
                    CMSampleBufferInvalidate(sampleBuffer);
                    CFRelease(sampleBuffer);
                    sampleBuffer = NULL;
                } else {
                    break;
                }
            }
            // Throttle to roughly the video's frame rate
            [NSThread sleepForTimeInterval:CMTimeGetSeconds(videoTrack.minFrameDuration)];
        }
        if (self.isCancelled) {
            _newVideoFrameBlock = nil;
            _decodeFinishedBlock = nil;
            return;
        }
        if (self.decodeFinishedBlock) {
            self.decodeFinishedBlock(self.filePath);
        }
    }
}
The frames are decoded explicitly because UIImage normally decodes only when it is about to be displayed, which can stall the main thread; the decompression is therefore forced on a background thread instead.
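The same idea applies to still images. A minimal sketch of forced decompression (not code from the demo; the method name is mine):

- (void)decodedImageWithImage:(UIImage *)image completion:(void (^)(UIImage *decoded))completion {
    dispatch_async(dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0), ^{
        // Drawing the image into a bitmap context forces decompression now,
        // instead of at first display on the main thread
        UIGraphicsBeginImageContextWithOptions(image.size, YES, image.scale);
        [image drawAtPoint:CGPointZero];
        UIImage *decoded = UIGraphicsGetImageFromCurrentImageContext();
        UIGraphicsEndImageContext();
        dispatch_async(dispatch_get_main_queue(), ^{
            completion(decoded ?: image);
        });
    });
}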
Concrete usage:
- (void)playVideoWithFilePath:(NSString *)filePath_ type:(NSString *)type {
    @weakify(self)
    [[VideoPlayerManager shareInstance] decodeVideo:filePath_
                              withVideoPerDataBlock:^(CGImageRef imageData, NSString *filePath) {
        @strongify(self)
        if ([type isEqualToString:@"video"]) {
            if ([filePath isEqualToString:self.filePath]) {
                [self.sources.firstObject setContents:(__bridge id _Nullable)(imageData)];
            }
        }
    } decodeFinishBlock:^(NSString *filePath) {
        // Loop the clip by restarting decoding when it finishes
        [self playVideoWithFilePath:filePath type:type];
    }];
}
4. Miscellaneous
1. Touch interaction is handled by overriding these methods:
- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event
- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event
- (void)touchesCancelled:(NSSet *)touches withEvent:(UIEvent *)event
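A hedged sketch of how the link rects saved in drawFramesetter:... can be hit-tested in one of these overrides (framesDict/relationDict come from the drawing code above; putting them and fillContents: on the same view is my assumption):

- (void)touchesEnded:(NSSet *)touches withEvent:(UIEvent *)event {
    CGPoint point = [[touches anyObject] locationInView:self];
    for (NSString *rangeString in self.relationDict) {
        for (NSString *rectString in self.relationDict[rangeString]) {
            if (CGRectContainsPoint(CGRectFromString(rectString), point)) {
                // Redraw with this link's rects filled in the pressed background color;
                // fillContents: above takes exactly such an array of rect strings
                [self fillContents:self.relationDict[rangeString]];
                return;
            }
        }
    }
    [super touchesEnded:touches withEvent:event];
}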
2. The on-screen FPS readout uses YYFPSLabel from the YYKit project.
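YYFPSLabel counts CADisplayLink callbacks per second. The core of the idea, sketched on a UILabel subclass with assumed ivars _link, _count, and _lastTime (YYFPSLabel itself also routes the display link's target through a weak proxy to avoid the retain cycle):

- (void)startCounting {
    // CADisplayLink fires once per screen refresh; counting the ticks
    // over a one-second window yields the FPS
    _link = [CADisplayLink displayLinkWithTarget:self selector:@selector(tick:)];
    [_link addToRunLoop:[NSRunLoop mainRunLoop] forMode:NSRunLoopCommonModes];
}

- (void)tick:(CADisplayLink *)link {
    if (_lastTime == 0) { _lastTime = link.timestamp; return; }
    _count++;
    NSTimeInterval delta = link.timestamp - _lastTime;
    if (delta < 1) return;
    _lastTime = link.timestamp;
    double fps = _count / delta;
    _count = 0;
    self.text = [NSString stringWithFormat:@"%d FPS", (int)round(fps)];
}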
3. The test data was taken from Weibo; the short videos come from Gif Kuaishou.
The code for this article is at https://github.com/hawk0620/PYQFeedDemo
Author: 伯樂在線 - Hawk0620