Camera Principles
Sensitivity (ISO)
ISO, also known as film speed, measures how sensitive the image sensor is to light. In bright scenes a low ISO is enough; in dark scenes a higher ISO is needed, at the cost of more image noise.
Aperture
The aperture measures the size of the opening through which light passes on its way to the image sensor. The iPhone 6 has an aperture of f/2.2; the 2.2 is the ratio of the lens's focal length to the effective diameter of the aperture.
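As a quick worked example (assuming the iPhone 6's published focal length of roughly 4.15 mm, a figure not stated in this article): the f-number is N = f / D, so the effective aperture diameter is D = f / N ≈ 4.15 mm / 2.2 ≈ 1.9 mm.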
Focus
Focusing renders objects within a certain range of distances from the camera sharply; objects that are too close or too far away appear blurred, a condition called being out of focus.
Saving and Retrieving Files
For processing the RAW format with Core Image, see CoreImage/CIRAWFilter.h as well as WWDC 2014 session 514: https://developer.apple.com/videos/wwdc/2014/#514
Working with Image Data
The classes for working with bitmaps are UIImage, CGImage (Core Graphics), and CIImage (Core Image). To get a UIImage from an NSData object, use the imageWithData: method (imageWithContentsOfFile: is the one for loading from a path instead).
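A minimal sketch of moving between these three types, in the same Swift style as the rest of this article (imageData is assumed to be an NSData holding encoded image bytes):

```swift
// Decode encoded bytes (JPEG or PNG) into a UIImage.
if let image = UIImage(data: imageData) {
    // The underlying Core Graphics bitmap, for pixel-level work.
    let cgImage = image.CGImage
    // Wrap the bitmap for Core Image filtering.
    let ciImage = CIImage(CGImage: cgImage)
}
```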
Capturing Images from the Camera
Compared with UIImagePickerController, AVFoundation gives you direct access to the camera and full control over it, for example changing hardware parameters programmatically or manipulating the live preview.
First create an AVCaptureStillImageOutput object and call its captureStillImageAsynchronouslyFromConnection:completionHandler: method.
Then use the AVCaptureStillImageOutput class method jpegStillImageNSDataRepresentation: to convert the captured sample buffer into an NSData object, and imageWithData: to turn that into a UIImage.
Along the way many parameters can be adjusted, such as exposure, focus, exposure compensation, flash, and ISO. All of these settings are applied to an AVCaptureDevice.
Related AVFoundation Classes
The following classes provide access to the raw data coming from the camera device and control over its components:
AVCaptureDevice: controls hardware features such as lens position, exposure, and flash.
AVCaptureDeviceInput: provides the data coming from the device.
AVCaptureOutput: an abstract class; its concrete subclasses below deliver the captured data in different forms.
AVCaptureStillImageOutput: used to capture still images.
AVCaptureMetadataOutput: enables detection of faces and QR codes.
AVCaptureVideoDataOutput: provides the raw frames for a live preview.
AVCaptureSession: manages the data flow between the inputs and outputs, and generates runtime errors when problems occur.
AVCaptureVideoPreviewLayer: a subclass of CALayer that automatically displays the live image produced by the camera.
Capture Setup
```swift
let session = AVCaptureSession()

// Find the front- and back-facing camera devices.
let availableCameraDevices = AVCaptureDevice.devicesWithMediaType(AVMediaTypeVideo)
for device in availableCameraDevices as [AVCaptureDevice] {
    if device.position == .Back {
        backCameraDevice = device
    }
    else if device.position == .Front {
        frontCameraDevice = device
    }
}

// Wrap the back camera in a device input and attach it to the session.
var error: NSError?
let possibleCameraInput: AnyObject? = AVCaptureDeviceInput.deviceInputWithDevice(backCameraDevice, error: &error)
if let backCameraInput = possibleCameraInput as? AVCaptureDeviceInput {
    if self.session.canAddInput(backCameraInput) {
        self.session.addInput(backCameraInput)
    }
}

// Check, and if necessary request, permission to use the camera.
let authorizationStatus = AVCaptureDevice.authorizationStatusForMediaType(AVMediaTypeVideo)
switch authorizationStatus {
case .NotDetermined:
    // The user has not yet been asked for camera access.
    AVCaptureDevice.requestAccessForMediaType(AVMediaTypeVideo,
        completionHandler: { (granted: Bool) -> Void in
            if granted {
                // Access granted: continue setting up the session.
            }
            else {
                // Access refused: explain why the camera is needed.
            }
    })
case .Authorized:
    // The user granted access earlier.
    break
case .Denied, .Restricted:
    // The user denied access, or access is restricted (e.g. parental controls).
    break
}

// The simplest way to show the live image: a preview layer.
previewLayer = AVCaptureVideoPreviewLayer(session: session)
previewLayer.frame = view.bounds
view.layer.addSublayer(previewLayer)

// Alternatively, render the raw frames yourself into a GLKView with Core Image...
glContext = EAGLContext(API: .OpenGLES2)
glView = GLKView(frame: viewFrame, context: glContext)
ciContext = CIContext(EAGLContext: glContext)

// ...fed by a video data output whose delegate receives every frame.
videoOutput = AVCaptureVideoDataOutput()
videoOutput.setSampleBufferDelegate(self, queue: dispatch_queue_create("sample buffer delegate", DISPATCH_QUEUE_SERIAL))
if session.canAddOutput(self.videoOutput) {
    session.addOutput(self.videoOutput)
}

// AVCaptureVideoDataOutputSampleBufferDelegate: draw each incoming frame.
func captureOutput(captureOutput: AVCaptureOutput!, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, fromConnection connection: AVCaptureConnection!) {
    let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)
    let image = CIImage(CVPixelBuffer: pixelBuffer)
    if glContext != EAGLContext.currentContext() {
        EAGLContext.setCurrentContext(glContext)
    }
    glView.bindDrawable()
    ciContext.drawImage(image, inRect: image.extent(), fromRect: image.extent())
    glView.display()
}

// A still image output for taking photos, with the high-resolution photo preset.
stillCameraOutput = AVCaptureStillImageOutput()
if self.session.canAddOutput(self.stillCameraOutput) {
    self.session.addOutput(self.stillCameraOutput)
}
session.sessionPreset = AVCaptureSessionPresetPhoto
```
Controlling the Camera
As of iOS 8, focus, exposure, and white balance can all be adjusted manually. The lens aperture, however, is fixed on iPhone hardware and cannot be changed.
```swift
// All session interaction happens on a dedicated serial queue,
// so the main thread is never blocked.
sessionQueue = dispatch_queue_create("com.example.camera.capture_session", DISPATCH_QUEUE_SERIAL)
dispatch_async(sessionQueue) { () -> Void in
    self.session.startRunning()
}

// Before changing any device parameter, the device must be locked for configuration.
var error: NSError?
if currentDevice.lockForConfiguration(&error) {
    // Lock acquired: adjust focus, exposure, white balance, etc. here,
    // then call unlockForConfiguration().
}
else {
    // The device could not be locked; inspect `error`.
}
```
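For example, a minimal sketch of toggling the flash inside such a configuration lock (currentDevice is assumed to be the active AVCaptureDevice, as above):

```swift
var error: NSError?
if currentDevice.lockForConfiguration(&error) {
    // Only set modes the hardware actually supports.
    if currentDevice.isFlashModeSupported(.On) {
        currentDevice.flashMode = .On
    }
    // Release the lock so other configuration changes can proceed.
    currentDevice.unlockForConfiguration()
}
```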
Focus
The AVCaptureFocusMode enum describes the available focus modes: .Locked, .AutoFocus (a single scan, then lock), and .ContinuousAutoFocus.
Focus can be set with a UISlider, which is roughly the equivalent of a DSLR's focus ring. Manual focusing is easier with an assistant that indicates which areas are sharp. One such aid is focus peaking, which highlights the in-focus regions: apply a threshold edge filter (a custom CIFilter, or GPUImage's GPUImageThresholdEdgeDetectionFilter) and overlay its output on the live preview from the AVCaptureVideoDataOutputSampleBufferDelegate method captureOutput(_:didOutputSampleBuffer:fromConnection:). A sketch of that idea follows.
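This is only a sketch of the focus-peaking idea using the third-party GPUImage framework, which drives the camera itself rather than the AVCaptureSession pipeline shown earlier; the threshold value, the blend mix, and the filter chain ordering are illustrative assumptions:

```swift
// GPUImage replaces the manual AVCaptureSession wiring shown above.
let camera = GPUImageVideoCamera(sessionPreset: AVCaptureSessionPresetHigh, cameraPosition: .Back)
camera.outputImageOrientation = .Portrait

// The threshold edge detection filter produces bright edges where
// local contrast (i.e. sharpness) is high.
let edgeFilter = GPUImageThresholdEdgeDetectionFilter()
edgeFilter.threshold = 0.4 // illustrative value, tune by eye

// Blend the highlighted edges over the unfiltered live image.
let blendFilter = GPUImageAlphaBlendFilter()
blendFilter.mix = 0.8 // illustrative overlay opacity

camera.addTarget(edgeFilter)
camera.addTarget(blendFilter)      // first blend input: the live image
edgeFilter.addTarget(blendFilter)  // second blend input: the edges

let previewView = GPUImageView(frame: viewFrame)
blendFilter.addTarget(previewView)
camera.startCameraCapture()
```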
```swift
// Back to the AVFoundation pipeline: switch the focus mode,
// if the device supports it.
let focusMode: AVCaptureFocusMode = ...
if currentCameraDevice.isFocusModeSupported(focusMode) {
    ... // lock the device for configuration first
    currentCameraDevice.focusMode = focusMode
    ...
}

// Tap to focus: convert the tap location from view coordinates
// into the camera's coordinate space.
var pointInPreview = focusTapGR.locationInView(focusTapGR.view)
var pointInCamera = previewLayer.captureDevicePointOfInterestForPoint(pointInPreview)
...
currentCameraDevice.focusPointOfInterest = pointInCamera
currentCameraDevice.focusMode = .AutoFocus
...

// Manual focus (iOS 8): set the lens position directly,
// from 0.0 (nearest) to 1.0 (furthest).
...
var lensPosition: Float = ...
currentCameraDevice.setFocusModeLockedWithLensPosition(lensPosition) {
    (timestamp: CMTime) -> Void in
    // The lens has arrived at the requested position.
}
...
```
Exposure
The exposure compensation (target bias) ranges from minExposureTargetBias to maxExposureTargetBias, in EV steps; 0 is the default and means no compensation.
```swift
// Exposure compensation: a bias in EV steps, applied on top of
// whatever the auto-exposure decides.
var exposureBias: Float = ...
...
currentDevice.setExposureTargetBias(exposureBias) { (time: CMTime) -> Void in
    // The bias has been applied.
}
...

// Fully manual exposure (iOS 8): set shutter duration and ISO directly.
// The valid ranges come from the device's active format.
var activeFormat = currentDevice.activeFormat
var duration: CMTime = ...   // between activeFormat.minExposureDuration and maxExposureDuration
var iso: Float = ...         // between activeFormat.minISO and maxISO
...
currentDevice.setExposureModeCustomWithDuration(duration, ISO: iso) { (time: CMTime) -> Void in
    // The custom exposure is in effect.
}
...
```
White Balance
iOS 8 also allows manual control of white balance, through a color temperature expressed in degrees kelvin and a tint. Typical color temperatures run from about 2,000 K to 3,000 K (warm light sources such as a candle or a light bulb) up to 8,000 K (a clear blue sky). Tint ranges from -150 (a shift toward green) to 150 (a shift toward magenta).
```swift
// 3,000 K roughly compensates for incandescent (tungsten) light.
var incandescentLightCompensation: Float = 3_000
var tint: Float = 0 // no green/magenta shift
let temperatureAndTintValues = AVCaptureWhiteBalanceTemperatureAndTintValues(temperature: incandescentLightCompensation, tint: tint)

// Convert temperature and tint into device-specific RGB gains.
var deviceGains = currentCameraDevice.deviceWhiteBalanceGainsForTemperatureAndTintValues(temperatureAndTintValues)
...
currentCameraDevice.setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains(deviceGains) {
    (timestamp: CMTime) -> Void in
    // The gains have been applied.
}
...
```
Real-Time Face Detection
With AVCaptureMetadataOutput it is possible to detect faces and QR codes.
```swift
var metadataOutput = AVCaptureMetadataOutput()
metadataOutput.setMetadataObjectsDelegate(self, queue: self.sessionQueue)
if session.canAddOutput(metadataOutput) {
    session.addOutput(metadataOutput)
}
// Ask only for face metadata; this must be set after adding the output.
metadataOutput.metadataObjectTypes = [AVMetadataObjectTypeFace]

// AVCaptureMetadataOutputObjectsDelegate: called whenever metadata is detected.
func captureOutput(captureOutput: AVCaptureOutput!, didOutputMetadataObjects metadataObjects: [AnyObject]!, fromConnection connection: AVCaptureConnection!) {
    for metadataObject in metadataObjects as [AVMetadataObject] {
        if metadataObject.type == AVMetadataObjectTypeFace {
            // Convert the face's bounds into preview layer coordinates
            // so it can be drawn on screen.
            var transformedMetadataObject = previewLayer.transformedMetadataObjectForMetadataObject(metadataObject)
        }
    }
}
```
Capturing Still Images
To capture a high-resolution image, call captureStillImageAsynchronouslyFromConnection(_:completionHandler:). For visual feedback on when the capture starts and ends, use KVO to observe the capturingStillImage property of AVCaptureStillImageOutput.
```swift
dispatch_async(sessionQueue) { () -> Void in
    let connection = self.stillCameraOutput.connectionWithMediaType(AVMediaTypeVideo)

    // Derive the video orientation from the device orientation
    // (FaceUp, FaceDown, and Unknown have no video equivalent).
    connection.videoOrientation = AVCaptureVideoOrientation(rawValue: UIDevice.currentDevice().orientation.rawValue)!

    self.stillCameraOutput.captureStillImageAsynchronouslyFromConnection(connection) {
        (imageDataSampleBuffer, error) -> Void in
        if error == nil {
            // The sample buffer carries the JPEG data plus its metadata.
            let imageData = AVCaptureStillImageOutput.jpegStillImageNSDataRepresentation(imageDataSampleBuffer)
            let metadata: NSDictionary = CMCopyDictionaryOfAttachments(nil, imageDataSampleBuffer, CMAttachmentMode(kCMAttachmentMode_ShouldPropagate)).takeUnretainedValue()
            if let image = UIImage(data: imageData) {
                ... // display or save the image
            }
        }
        else {
            NSLog("error while capturing still image: \(error)")
        }
    }
}
```
Bracketed Capture
Bracketed capture takes a series of shots at different exposure settings, here biases of -1, 0, and +1 EV, which can afterwards be merged into a single photo with an HDR algorithm.
```swift
dispatch_async(sessionQueue) { () -> Void in
    let connection = self.stillCameraOutput.connectionWithMediaType(AVMediaTypeVideo)
    connection.videoOrientation = AVCaptureVideoOrientation(rawValue: UIDevice.currentDevice().orientation.rawValue)!

    // One settings object per shot: -1, 0, and +1 EV.
    var settings = [-1.0, 0.0, 1.0].map {
        (bias: Float) -> AVCaptureAutoExposureBracketedStillImageSettings in
        AVCaptureAutoExposureBracketedStillImageSettings.autoExposureSettingsWithExposureTargetBias(bias)
    }

    var counter = settings.count
    self.stillCameraOutput.captureStillImageBracketAsynchronouslyFromConnection(connection, withSettingsArray: settings) {
        (sampleBuffer, settings, error) -> Void in
        ... // process each sample buffer
        counter--  // the completion handler runs once per shot
    }
}
```
Manipulating Images
The following Objective-C method composes a stereogram from a left and a right image, drawing the two frames side by side under a black bar with two white alignment dots.
```objc
- (UIImage *)composeStereogramLeft:(UIImage *)leftImage right:(UIImage *)rightImage
{
    float w = leftImage.size.width;
    float h = leftImage.size.height;
    // Twice the width, plus a 32-point bar on top for the alignment dots.
    UIGraphicsBeginImageContext(CGSizeMake(w * 2.0, h + 32.0));
    [leftImage drawAtPoint:CGPointMake(0.0, 32.0)];
    [rightImage drawAtPoint:CGPointMake(w, 32.0)];
    float leftCircleX = (w / 2.0) - 8.0;
    float rightCircleX = leftCircleX + w;
    float circleY = 8.0;
    [[UIColor blackColor] setFill];
    UIRectFill(CGRectMake(0.0, 0.0, w * 2.0, 32.0));
    [[UIColor whiteColor] setFill];
    CGRect leftRect = CGRectMake(leftCircleX, circleY, 16.0, 16.0);
    CGRect rightRect = CGRectMake(rightCircleX, circleY, 16.0, 16.0);
    UIBezierPath *path = [UIBezierPath bezierPathWithOvalInRect:leftRect];
    [path appendPath:[UIBezierPath bezierPathWithOvalInRect:rightRect]];
    [path fill];
    UIImage *savedImg = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return savedImg;
}
```

A second fragment merges the two frames into a red/cyan anaglyph instead, rewriting the left bitmap's pixels in place (leftBitmap, rightBitmap, bitmapByteCount, and _leftContext are assumed to be set up by the surrounding method):

```objc
UInt8 *rightPtr = rightBitmap;
UInt8 *leftPtr = leftBitmap;
UInt8 r1, g1, b1;
UInt8 r2, g2, b2;
UInt8 ra, ga, ba;

for (NSUInteger idx = 0; idx < bitmapByteCount; idx += 4) {
    r1 = rightPtr[0]; g1 = rightPtr[1]; b1 = rightPtr[2];
    r2 = leftPtr[0];  g2 = leftPtr[1];  b2 = leftPtr[2];

    // New red channel: a 70/30 mix of the right frame's green and blue;
    // green and blue channels come from the left frame's blue channel.
    ra = 0.7 * g1 + 0.3 * b1;
    ga = b2;
    ba = b2;

    leftPtr[0] = ra;
    leftPtr[1] = ga;
    leftPtr[2] = ba;
    rightPtr += 4;
    leftPtr += 4;
}

CGImageRef composedImage = CGBitmapContextCreateImage(_leftContext);
UIImage *retval = [UIImage imageWithCGImage:composedImage];
CGImageRelease(composedImage);
return retval;
```
Metadata
The standard format for storing image information is Exif (Exchangeable Image File Format). It typically records the date and time the photo was taken, the shutter speed and aperture, and, where the device supports it, GPS coordinates. Exif data can be read with the CGImageSourceCopyPropertiesAtIndex function, as sketched below.
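A minimal sketch of reading those properties with ImageIO, in the same Swift style as the rest of the article (the file path is a placeholder):

```swift
import ImageIO

// Placeholder path; point this at a real JPEG on disk.
let imageURL = NSURL(fileURLWithPath: "/path/to/photo.jpg")!

if let source = CGImageSourceCreateWithURL(imageURL, nil) {
    // All metadata for the first image in the file, as a dictionary.
    let properties = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as NSDictionary
    // The Exif sub-dictionary holds shutter speed, ISO, and so on.
    if let exif = properties[kCGImagePropertyExifDictionary as NSString] as? NSDictionary {
        println("Exposure time: \(exif[kCGImagePropertyExifExposureTime as NSString])")
    }
}
```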