If you just want the system camera UI to take photos or record video, you can use `UIImagePickerController` directly.

As the diagram shows, to use this framework we need an `AVCaptureDevice`. The device provides the inputs (audio input, video input, and so on), which an `AVCaptureSession` then routes to an `AVCaptureOutput`.
0. Permissions

On both macOS and iOS you need to add the corresponding keys to Info.plist; conveniently, the key names are the same on both platforms:

`NSMicrophoneUsageDescription` — microphone permission; the value is a string explaining why the app uses the microphone.

`NSCameraUsageDescription` — camera permission; the value is a string explaining why the app uses the camera.

PS: Forgetting the usage description is not the end of the world — the app will crash when it requests access, and the Debug console will tell you which key is missing.
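For reference, the corresponding Info.plist entries look like this when viewed as source XML (the description strings below are just examples — use wording that matches your app):

```xml
<key>NSCameraUsageDescription</key>
<string>This app uses the camera to record video.</string>
<key>NSMicrophoneUsageDescription</key>
<string>This app uses the microphone to record audio.</string>
```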
Before using a capture device, check the authorization status:
```objc
// Request permission to access the camera and microphone.
switch ([AVCaptureDevice authorizationStatusForMediaType:AVMediaTypeVideo]) {
    case AVAuthorizationStatusAuthorized: {
        // The user has already granted access.
        // Set up the AVCaptureSession.
        [self setupCaptureSession];
        break;
    }
    case AVAuthorizationStatusNotDetermined: {
        // The app has not asked for permission yet; request camera access.
        [AVCaptureDevice requestAccessForMediaType:AVMediaTypeVideo completionHandler:^(BOOL granted) {
            if (granted) {
                // Set up the AVCaptureSession.
                [self setupCaptureSession];
            }
        }];
        break;
    }
    case AVAuthorizationStatusDenied: {
        // The user has explicitly denied access.
        break;
    }
    case AVAuthorizationStatusRestricted: {
        // The user cannot grant access (e.g. parental restrictions).
        break;
    }
}
```
Storing captured photos

On iOS you also need to add the `NSPhotoLibraryUsageDescription` key to Info.plist, with a value explaining why the app needs access to the photo library.

Request write access:
```objc
[PHPhotoLibrary requestAuthorization:^(PHAuthorizationStatus status) {
    // Handle the returned authorization status here.
}];
```
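Once authorized, a recorded movie file can be saved into the photo library via a change block. A minimal sketch, assuming `fileURL` points at a finished recording on disk (the name `fileURL` is hypothetical):

```objc
[PHPhotoLibrary requestAuthorization:^(PHAuthorizationStatus status) {
    if (status != PHAuthorizationStatusAuthorized) { return; }
    [[PHPhotoLibrary sharedPhotoLibrary] performChanges:^{
        // Create a new photo-library asset from the movie file on disk.
        [PHAssetChangeRequest creationRequestForAssetFromVideoAtFileURL:fileURL];
    } completionHandler:^(BOOL success, NSError * _Nullable error) {
        NSLog(@"Saved: %d, error: %@", success, error);
    }];
}];
```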
Recording audio and video
(Figure: 视频录制.png — Apple's capture-architecture diagram)
As Apple's diagram shows, just as `AVAudioSession` manages audio, `AVCaptureSession` manages capture devices (an `AVCaptureDevice` can be an audio capture device such as a microphone, a camera, and so on). Data flowing through the `AVCaptureSession` can be delivered to a photo output (`AVCapturePhotoOutput`), a movie file output (`AVCaptureMovieFileOutput`), or a preview layer (`AVCaptureVideoPreviewLayer`).
1. Create a capture session instance.

```objc
self.captureSession = [[AVCaptureSession alloc] init];
```
2. Get a capture device instance.

```objc
self.captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
```
3. Create capture device inputs from the devices and add them to the captureSession.

```objc
NSError *error = nil;

// Add the video input.
self.captureDevice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
self.deviceInput = [AVCaptureDeviceInput deviceInputWithDevice:self.captureDevice error:&error];
if (self.deviceInput && [self.captureSession canAddInput:self.deviceInput]) {
    [self.captureSession addInput:self.deviceInput];
}

// Add the audio input.
self.captureDeviceVoice = [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeAudio];
self.deviceInputVoice = [AVCaptureDeviceInput deviceInputWithDevice:self.captureDeviceVoice error:&error];
if (self.deviceInputVoice && [self.captureSession canAddInput:self.deviceInputVoice]) {
    [self.captureSession addInput:self.deviceInputVoice];
}
```
Previewing the recording

To preview what is being recorded, we use the `AVCaptureVideoPreviewLayer` class.

```objc
self.previewLayer = [[AVCaptureVideoPreviewLayer alloc] initWithSession:self.captureSession];
self.previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill;
self.previewLayer.frame = self.view.bounds;
[self.view.layer addSublayer:self.previewLayer];
```
Note that until the session's `startRunning` method has been called, the preview layer will show nothing at all.
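Since `-startRunning` blocks until the session actually starts (or fails), Apple recommends calling it off the main thread. A minimal sketch, assuming the controller owns a serial dispatch queue (the `sessionQueue` property here is an assumption, not part of the code above):

```objc
// -startRunning is a blocking call, so dispatch it off the main thread.
// self.sessionQueue is assumed to be a serial queue owned by this controller.
dispatch_async(self.sessionQueue, ^{
    [self.captureSession startRunning];
});
```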
Outputting audio and video

```objc
// Set up the movie file output.
self.movieFileOutput = [[AVCaptureMovieFileOutput alloc] init];
if ([self.captureSession canAddOutput:self.movieFileOutput]) {
    [self.captureSession addOutput:self.movieFileOutput];
}
```
PS: Taking a shortcut here — a BOOL property tracks whether we are currently recording.

```objc
- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event {
    if (self.isRecording) {
        [self.movieFileOutput stopRecording];
        self.isRecording = NO;
    } else {
        // AVCaptureMovieFileOutput writes QuickTime movie files, so use a .mov extension.
        NSString *path = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory, NSUserDomainMask, YES).firstObject
                          stringByAppendingPathComponent:@"recording.mov"];
        [self.movieFileOutput startRecordingToOutputFileURL:[NSURL fileURLWithPath:path]
                                          recordingDelegate:self];
        self.isRecording = YES;
    }
}
```
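The `recordingDelegate` passed above must conform to `AVCaptureFileOutputRecordingDelegate`, whose one required method is the finish callback. A minimal sketch (how you handle the error is up to you):

```objc
// Required AVCaptureFileOutputRecordingDelegate callback: called when
// recording to outputFileURL finishes, whether successfully or with an error.
- (void)captureOutput:(AVCaptureFileOutput *)output
didFinishRecordingToOutputFileURL:(NSURL *)outputFileURL
      fromConnections:(NSArray<AVCaptureConnection *> *)connections
                error:(NSError *)error {
    if (error) {
        NSLog(@"Recording failed: %@", error);
        return;
    }
    NSLog(@"Recording saved to %@", outputFileURL);
}
```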