iOS Audio and Video Playback (1): AVFoundation

Copyright notice: This is the author's original article and may not be reproduced without permission. https://blog.csdn.net/szk972092933/article/details/82733874

The system provides two frameworks, AVFoundation and AVKit, to support audio and video playback.

[Figure: media playback framework (../Art/media_playback_framework_2x.png)]

AVKit is built on top of AVFoundation and provides a basic playback interface, while AVFoundation itself offers more advanced capabilities. With AVPlayerViewController you can put together an audio/video player with very little code:

func playVideo(_ sender: UIButton) {
    guard let url = URL(string: "https://devimages-cdn.apple.com/samplecode/avfoundationMedia/AVFoundationQueuePlayer_HLS2/master.m3u8") else { return }

    let player = AVPlayer(url: url)

    let controller = AVPlayerViewController()
    controller.player = player

    present(controller, animated: true) {
        player.play()
    }
}

For more advanced functionality, use the AVFoundation classes directly.

Several important classes are introduced below:

1. Audio Session. Before your app plays any audio or video, the system sets up a default audio session, which acts as the intermediary between the operating system and your app for audio management.

By default, the session supports audio playback but not recording; playback is muted when the Ring/Silent switch is set to silent and when the device is locked; and audio playing in other apps in the background is silenced while your app plays. If this behavior does not suit your app, configure the audio session category you need:

    let audioSession = AVAudioSession.sharedInstance()
    do {
        try audioSession.setCategory(AVAudioSessionCategoryPlayback)
    } catch {
        print("Setting category to AVAudioSessionCategoryPlayback failed.")
    }

Setting the audio session category to AVAudioSessionCategoryPlayback indicates that audio playback is a central feature of your app. It allows audio to play even with the Ring/Silent switch set to silent, and, when the Audio, AirPlay, and Picture in Picture background mode is enabled, lets your app continue playing audio in the background.
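Setting the category alone is not enough; the session must also be activated before playback begins. A minimal sketch (the error message here is illustrative):

```swift
import AVFoundation

let audioSession = AVAudioSession.sharedInstance()
do {
    // Activate the session so the configured category takes effect
    try audioSession.setActive(true)
} catch {
    print("Failed to activate the audio session: \(error)")
}
```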

[Figure: Background Modes capability with Audio, AirPlay, and Picture in Picture enabled (../Art/background_modes.shot/Resources/shot_2x.png)]

2. AVAsset is the asset class responsible for loading audio/video resources; it is the static model of a media resource.


AVAssetTrack is the data model for a single audio or video track within an asset.

let url: URL = // Local or Remote Asset URL
let asset = AVAsset(url: url)

// AVAsset is an abstract class; AVURLAsset is what actually gets created.
// For finer control over loading, create an AVURLAsset directly:
let url: URL = // Remote Asset URL
// Only load the resource over Wi-Fi, never over the cellular network
let options = [AVURLAssetAllowsCellularAccessKey: false]
let asset = AVURLAsset(url: url, options: options)

Property loading is synchronous by default, so loading a large asset's properties on the main thread can block it. To avoid this, load the values asynchronously with loadValuesAsynchronously(forKeys:), then use statusOfValue(forKey:error:) to check the load status of each key. Because the completion handler may be invoked on a background queue, dispatch back to the main queue before updating the UI.

// URL of a bundle asset called 'example.mp4'
let url = Bundle.main.url(forResource: "example", withExtension: "mp4")!
let asset = AVAsset(url: url)
let playableKey = "playable"
 
// Load the "playable" property
asset.loadValuesAsynchronously(forKeys: [playableKey]) {
    var error: NSError? = nil
    let status = asset.statusOfValue(forKey: playableKey, error: &error)
    switch status {
    case .loaded:
        // Successfully loaded. Continue processing.
        break
    case .failed:
        // Handle error
        break
    case .cancelled:
        // Terminate processing
        break
    default:
        // Handle all other cases
        break
    }
}

3. AVMetadataItem reads an AVAsset's metadata. AVFoundation groups metadata into two kinds of key spaces: format-specific key spaces and a common key space.

An asset may contain metadata in several formats; you can read all of it through the asset's metadata-related properties:

let url = Bundle.main.url(forResource: "audio", withExtension: "m4a")!
let asset = AVAsset(url: url)
let formatsKey = "availableMetadataFormats"
asset.loadValuesAsynchronously(forKeys: [formatsKey]) {
    var error: NSError? = nil
    let status = asset.statusOfValue(forKey: formatsKey, error: &error)
    if status == .loaded {
        for format in asset.availableMetadataFormats {
            let metadata = asset.metadata(forFormat: format)
            // process format-specific metadata collection
        }
    }
}

Once you have the metadata items, read their values:

// Collection of "common" metadata
let metadata = asset.commonMetadata
// Filter metadata to find the asset's artwork
let artworkItems =
    AVMetadataItem.metadataItems(from: metadata,
                                 filteredByIdentifier: AVMetadataCommonIdentifierArtwork)
if let artworkItem = artworkItems.first {
    // Coerce the value to an NSData using its dataValue property
    if let imageData = artworkItem.dataValue {
        let image = UIImage(data: imageData)
        // process image
    } else {
        // No image data found
    }
}

4. AVPlayer drives playback of audio/video and manages timing. An AVPlayer plays only one asset at a time; to play several assets in sequence, use its subclass AVQueuePlayer to manage a playback queue.
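A minimal AVQueuePlayer sketch (the URL array here is a placeholder to be filled with real media URLs):

```swift
import AVFoundation

// Placeholder: fill with local or remote media URLs
let urls: [URL] = []
let items = urls.map { AVPlayerItem(url: $0) }

// AVQueuePlayer plays its items in order
let queuePlayer = AVQueuePlayer(items: items)
queuePlayer.play()

// Skip the rest of the current item and move to the next one
queuePlayer.advanceToNextItem()
```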

5. AVPlayerItem mediates between an asset and an AVPlayer; it models the dynamic aspects of the media, managing its timing and presentation state.

6. AVKit and AVPlayerLayer are responsible for the playback view.
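When AVPlayerViewController's stock interface is not enough, a custom view can render video by backing itself with an AVPlayerLayer; a minimal sketch:

```swift
import AVFoundation
import UIKit

class CustomPlayerView: UIView {
    // Back this view with an AVPlayerLayer instead of a plain CALayer
    override class var layerClass: AnyClass {
        return AVPlayerLayer.self
    }

    // Expose the layer's player so callers can attach an AVPlayer
    var player: AVPlayer? {
        get { return (layer as? AVPlayerLayer)?.player }
        set { (layer as? AVPlayerLayer)?.player = newValue }
    }
}
```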

The relationships among these classes can be represented with a hierarchy diagram.

A basic player can be created as follows:

class PlayerViewController: UIViewController {
 
    @IBOutlet weak var playerViewController: AVPlayerViewController!
 
    var player: AVPlayer!
    var playerItem: AVPlayerItem!
 
    override func viewDidLoad() {
        super.viewDidLoad()
 
        // 1) Define asset URL
        let url: URL = // URL to local or streamed media
 
        // 2) Create asset instance
        let asset = AVAsset(url: url)
 
        // 3) Create player item
        playerItem = AVPlayerItem(asset: asset)
 
        // 4) Create player instance
        player = AVPlayer(playerItem: playerItem)
 
        // 5) Associate player with view controller
        playerViewController.player = player
    }
 
}

You can use KVO to observe playback status:

let url: URL = // Asset URL
 
var asset: AVAsset!
var player: AVPlayer!
var playerItem: AVPlayerItem!
 
// Key-value observing context
private var playerItemContext = 0
 
let requiredAssetKeys = [
    "playable",
    "hasProtectedContent"
]
 
func prepareToPlay() {
    // Create the asset to play
    asset = AVAsset(url: url)
 
    // Create a new AVPlayerItem with the asset and an
    // array of asset keys to be automatically loaded
    playerItem = AVPlayerItem(asset: asset,
                              automaticallyLoadedAssetKeys: requiredAssetKeys)
 
    // Register as an observer of the player item's status property
    playerItem.addObserver(self,
                           forKeyPath: #keyPath(AVPlayerItem.status),
                           options: [.old, .new],
                           context: &playerItemContext)
 
    // Associate the player item with the player
    player = AVPlayer(playerItem: playerItem)
}

override func observeValue(forKeyPath keyPath: String?,
                           of object: Any?,
                           change: [NSKeyValueChangeKey : Any]?,
                           context: UnsafeMutableRawPointer?) {
 
    // Only handle observations for the playerItemContext
    guard context == &playerItemContext else {
        super.observeValue(forKeyPath: keyPath,
                           of: object,
                           change: change,
                           context: context)
        return
    }
 
    if keyPath == #keyPath(AVPlayerItem.status) {
        let status: AVPlayerItemStatus
        if let statusNumber = change?[.newKey] as? NSNumber {
            status = AVPlayerItemStatus(rawValue: statusNumber.intValue)!
        } else {
            status = .unknown
        }
        // Switch over status value
        switch status {
        case .readyToPlay:
            // Player item is ready to play.
            break
        case .failed:
            // Player item failed. See error.
            break
        case .unknown:
            // Player item is not yet ready.
            break
        }
    }
}

Working with time: to observe playback progress, Core Media provides a low-level value type, CMTime, to represent media time:

// 0.25 seconds
let quarterSecond = CMTime(value: 1, timescale: 4)
 
// 10 second mark in a 44.1 kHz audio file
let tenSeconds = CMTime(value: 441000, timescale: 44100)
 
// 3 seconds into a 30fps video
let cursor = CMTime(value: 90, timescale: 30)

AVPlayer provides two ways to observe time: addPeriodicTimeObserver(forInterval:queue:using:) invokes a callback at a regular interval, while addBoundaryTimeObserver(forTimes:queue:using:) invokes a callback when playback crosses specific time points. Seeking to a given time is also supported:
var player: AVPlayer!
var playerItem: AVPlayerItem!
var timeObserverToken: Any?
 
func addPeriodicTimeObserver() {
    // Notify every half second
    let timeScale = CMTimeScale(NSEC_PER_SEC)
    let time = CMTime(seconds: 0.5, preferredTimescale: timeScale)
    timeObserverToken = player.addPeriodicTimeObserver(forInterval: time,
                                                       queue: .main) {
        [weak self] time in
        // update player transport UI
    }
}
 
func removePeriodicTimeObserver() {
    if let timeObserverToken = timeObserverToken {
        player.removeTimeObserver(timeObserverToken)
        self.timeObserverToken = nil
    }
}
var asset: AVAsset!
var player: AVPlayer!
var playerItem: AVPlayerItem!
var timeObserverToken: Any?
 
func addBoundaryTimeObserver() {
 
    // Divide the asset's duration into quarters.
    let interval = CMTimeMultiplyByFloat64(asset.duration, 0.25)
    var currentTime = kCMTimeZero
    var times = [NSValue]()
 
    // Calculate boundary times
    while currentTime < asset.duration {
        currentTime = currentTime + interval
        times.append(NSValue(time:currentTime))
    }
 
    timeObserverToken = player.addBoundaryTimeObserver(forTimes: times,
                                                       queue: .main) {
        // Update UI
    }
}
 
func removeBoundaryTimeObserver() {
    if let timeObserverToken = timeObserverToken {
        player.removeTimeObserver(timeObserverToken)
        self.timeObserverToken = nil
    }
}
// Seek to the 2 minute mark
let time = CMTime(value: 120, timescale: 1)
player.seek(to: time)
// Seek to the first frame at 3:25 mark
let seekTime = CMTime(seconds: 205, preferredTimescale: Int32(NSEC_PER_SEC))
player.seek(to: seekTime, toleranceBefore: kCMTimeZero, toleranceAfter: kCMTimeZero)
