In our social app, a user's feed is made up of photos, videos, and text. For each photo and video, we show the full caption and the five latest comments.

Because users like to use captions to tell the story behind their photos, captions are usually long and complex, and may contain hyperlinks and emoji.

Rendering such complex text brings problems: it degrades performance when scrolling.

Even on new devices such as the iPhone 12, the initial drawing of a complex caption takes up to 50 milliseconds, and displaying the text takes up to 30 milliseconds; rendering is very slow.

And text is the simple case; sometimes we need to load heavier content such as images or even videos.

All these steps occur on the UI thread, causing the app to drop frames when the user scrolls.

When the main thread has to process too many operations, the most common consequence is dropped frames, which happen whenever we cannot guarantee 60 fps (one frame every 16.67 milliseconds).

Basic knowledge

Before starting, it is best to understand a few basic concepts.

The main thread should not be used for heavy operations; it is mainly for:
1. Accepting user input/interaction;
2. Displaying results and updating the UI.

Heavy work belongs on a background queue, as the sketch below shows.
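
A minimal sketch of that split, assuming loadComments() and render(_:) are hypothetical app functions and loadComments() is expensive and thread-safe:

DispatchQueue.global(qos: .userInitiated).async {
    let comments = loadComments()   // heavy work, off the main thread
    DispatchQueue.main.async {
        self.render(comments)       // UI updates, back on the main thread
    }
}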

Accurately identifying and debugging dropped frames

Sometimes dropped frames are easy to spot, because their most common manifestation is unresponsiveness or stuttering.

We can use Umeng+ U-APM to check whether stutters occur even on newer devices such as the iPhone 12.

The data showed stuttering even on the iPhone 12, which suggests that there is room for optimization in our code and that the problem is not the user's device configuration.

Next, we need a more accurate method to track down issues.

We tried using CADisplayLink and Time Profiler.

Using the CADisplayLink class:

class DroppingFramesHelper: NSObject {
    private var firstTime: TimeInterval = 0.0
    private var lastTime: TimeInterval = 0.0

    func activate() {
        let link = CADisplayLink(target: self, selector: #selector(update(link:)))
        link.add(to: .main, forMode: .common)
    }

    @objc private func update(link: CADisplayLink) {
        if lastTime == 0 {
            firstTime = link.timestamp
            lastTime = link.timestamp
        }
        let currentTime = link.timestamp
        // Time since the previous frame, in milliseconds (0.1 ms precision).
        let elapsedTime = floor((currentTime - lastTime) * 10_000) / 10
        let totalElapsedTime = currentTime - firstTime
        if elapsedTime > 16.7 {
            print("[DFH] Frame was dropped with elapsed time of \(elapsedTime) ms at \(totalElapsedTime)")
        }
        lastTime = link.timestamp
    }
}

Then create an instance and activate it in the AppDelegate's application(_:didFinishLaunchingWithOptions:):

DroppingFramesHelper().activate()

Now, if the app drops frames while we test it, we can monitor them in the console:
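
The output from the print statement above will look something like this (values illustrative):

[DFH] Frame was dropped with elapsed time of 33.4 ms at 2.6362917500029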

Take measures

Now that the console and Umeng+ U-APM tell us frames are being dropped, what can we do? We can take some of these measures:

(1) Reduce the number of views and transparent views
(2) Minimize the load in frequently called functions
(3) Decode JPEG images off the main thread
(4) Avoid off-screen rendering

We will discuss them one by one.

1. Reduce the number of views and transparent views

To improve the application's performance, the first things to do are:

• Reduce the number of views.
• Reduce transparency.

The solution is simple:

label.layer.opacity = 1.0
label.backgroundColor = .white

To observe overlapping transparency more easily, we can use a very convenient tool: Debug -> View Debugging -> Rendering -> Color Blended Layers.

This tool makes it easy to find overlapping views.

In short: when we do not need transparency, we give labels an opaque background color, as the code above does.

2. Minimize the load in frequently called functions

Functions that are called continuously during scrolling, such as scrollViewDidScroll or cellForItemAt indexPath, must execute very quickly.

So we use the simplest possible views/cells and keep the algorithms in these paths very light and fast (for example, cell configuration that involves no layout constraints and no object allocation). A sketch of this idea follows.
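
A minimal sketch, assuming heightCache and computeHeight(for:) are hypothetical helpers that cache an expensive per-item measurement so the delegate callback stays cheap:

private var heightCache: [IndexPath: CGFloat] = [:]

func collectionView(_ collectionView: UICollectionView,
                    layout collectionViewLayout: UICollectionViewLayout,
                    sizeForItemAt indexPath: IndexPath) -> CGSize {
    // Return the cached height if we already measured this item.
    if let cached = heightCache[indexPath] {
        return CGSize(width: collectionView.bounds.width, height: cached)
    }
    let height = computeHeight(for: indexPath)   // expensive, done only once
    heightCache[indexPath] = height
    return CGSize(width: collectionView.bounds.width, height: height)
}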

3. Decode JPEG images

When we deal with frame loss, a common optimization point is image decoding.

Usually this decoding happens on the main thread when an image is assigned to a UIImageView, but when the image is very large it slows our application down.

To alleviate this, one solution is to move the decoding to a background queue. Decoding there is not as efficient as UIImageView's normal decoding, but the main thread stays idle.

Decoding the image in the background:

extension UIImage {
    class func decodedImage(_ image: UIImage) -> UIImage? {
        guard let newImage = image.cgImage else { return nil }
        // To optimize this further, you could add some cache control.
        let colorspace = CGColorSpaceCreateDeviceRGB()
        let context = CGContext(data: nil,
                                width: newImage.width,
                                height: newImage.height,
                                bitsPerComponent: 8,
                                bytesPerRow: newImage.width * 4,
                                space: colorspace,
                                bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue)
        // Drawing into the bitmap context forces the JPEG data to be decoded
        // right now, on the current (background) thread, instead of lazily at display time.
        context?.draw(newImage, in: CGRect(x: 0, y: 0, width: newImage.width, height: newImage.height))
        let drawnImage = context?.makeImage()
        if let drawnImage = drawnImage {
            return UIImage(cgImage: drawnImage)
        }
        return nil
    }
}

You can wrap this in a view, and add some more advanced cache control on top to improve efficiency:

import UIKit

class AsyncImageView: UIView {
    private var _image: UIImage?

    var image: UIImage? {
        get {
            return _image
        }
        set {
            _image = newValue
            layer.contents = nil
            guard let image = newValue else { return }
            DispatchQueue.global(qos: .userInitiated).async {
                // Wait for pending main-thread work before decoding;
                // removing this sync caused crashes in our tests (see below).
                DispatchQueue.main.sync { }
                let decodedImage = UIImage.decodedImage(image)
                DispatchQueue.main.async {
                    self.layer.contents = decodedImage?.cgImage
                }
            }
        }
    }
}

You can use AsyncImageView to decode images on a background thread instead of the main thread.
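
A usage sketch (avatarFrame and photo are hypothetical values):

let avatarView = AsyncImageView(frame: avatarFrame)
view.addSubview(avatarView)
avatarView.image = photo   // decoding kicks off on a background queue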


During debugging we tried removing the sync call, but the program crashed. To find out why, we used Umeng+ U-APM's exception detection.

The report showed that the code triggered an OOM (out-of-memory) alarm. Memory warnings are delivered on the main thread, while we are processing the image in the background, so if we use too much memory, unexpected behavior can occur and bring extreme risks (such as this crash).

4. Off-screen rendering

When we set certain attributes of UI elements, we may run into off-screen rendering, because those elements have to be prepared before they can be rendered. This consumes a lot of CPU and GPU time.

How do we discover this problem?

We used the tool: Debug -> View Debugging -> Rendering -> Color Offscreen-Rendered Yellow.

Similar to the tool in the second point, this one highlights the offending elements, in this case in yellow.

Code like the following triggers it:

imageView.layer.cornerRadius = avatarImageHeight / 2.0

We use UIBezierPath instead, which can solve this specific off-screen rendering problem:
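
A minimal sketch of that idea: pre-render the rounded image once with UIBezierPath instead of setting layer.cornerRadius, so nothing has to be clipped at display time (rounded(cornerRadius:) is our own helper, not a UIKit API):

extension UIImage {
    func rounded(cornerRadius: CGFloat) -> UIImage {
        let renderer = UIGraphicsImageRenderer(size: size)
        return renderer.image { _ in
            let rect = CGRect(origin: .zero, size: size)
            // Clip to a rounded rect, then draw the image inside it.
            UIBezierPath(roundedRect: rect, cornerRadius: cornerRadius).addClip()
            draw(in: rect)
        }
    }
}

// The rounding happens once, up front, not on every frame:
imageView.image = avatarImage.rounded(cornerRadius: avatarImageHeight / 2.0)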

In short, here is some experience gained through debugging:

1. Avoid the cornerRadius property;
2. Avoid using shouldRasterize;
3. Use .rounded() values because they are easier to calculate;
4. Remember that shadows also cause off-screen rendering.

Other suggestions

Readers can also try the following optimization suggestions:

1. Text measurement (boundingRectWithSize); debugging it can be very onerous, so avoid it unless you really need it (a sketch follows this list).
2. Check your layout structure, especially when using Auto Layout and supporting old devices.
3. Try to move work to background queues, but pay attention to memory warnings.
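
A measurement sketch, assuming caption, maxWidth, and font are hypothetical values; the result should be cached rather than recomputed on every layout pass:

let bounds = (caption as NSString).boundingRect(
    with: CGSize(width: maxWidth, height: .greatestFiniteMagnitude),
    options: [.usesLineFragmentOrigin, .usesFontLeading],
    attributes: [.font: font],
    context: nil)
let captionHeight = ceil(bounds.height)   // height for this caption at maxWidth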

