I can't find a simple way to take a photo without showing a camera interface. I just need to get a picture from the camera and save it to a file.
I used this code to take a photo with the front camera. Not all of the code is mine, but I couldn't find a link to the original source. This code also produces a shutter sound. Image quality is not very good (it's quite dark), so the code needs a tweak or two.
// Requires AVFoundation (link the framework and #import <AVFoundation/AVFoundation.h>).
- (void)takePhoto
{
    // Find the front-facing camera among the available video devices.
    AVCaptureDevice *frontalCamera;
    NSArray *allCameras = [AVCaptureDevice devicesWithMediaType:AVMediaTypeVideo];
    for (int i = 0; i < allCameras.count; i++)
    {
        AVCaptureDevice *camera = [allCameras objectAtIndex:i];
        if (camera.position == AVCaptureDevicePositionFront)
        {
            frontalCamera = camera;
        }
    }
    if (frontalCamera != nil)
    {
        // Build a capture session with the camera as input and a JPEG still-image output.
        photoSession = [[AVCaptureSession alloc] init];
        NSError *error;
        AVCaptureDeviceInput *input =
            [AVCaptureDeviceInput deviceInputWithDevice:frontalCamera error:&error];
        if (!error && [photoSession canAddInput:input])
        {
            [photoSession addInput:input];
            AVCaptureStillImageOutput *output = [[AVCaptureStillImageOutput alloc] init];
            [output setOutputSettings:
                [[NSDictionary alloc] initWithObjectsAndKeys:AVVideoCodecJPEG, AVVideoCodecKey, nil]];
            if ([photoSession canAddOutput:output])
            {
                [photoSession addOutput:output];
                // Find the video connection on the still-image output.
                AVCaptureConnection *videoConnection = nil;
                for (AVCaptureConnection *connection in output.connections)
                {
                    for (AVCaptureInputPort *port in [connection inputPorts])
                    {
                        if ([[port mediaType] isEqual:AVMediaTypeVideo])
                        {
                            videoConnection = connection;
                            break;
                        }
                    }
                    if (videoConnection) { break; }
                }
                if (videoConnection)
                {
                    // Start the session and capture a single still frame.
                    [photoSession startRunning];
                    [output captureStillImageAsynchronouslyFromConnection:videoConnection
                        completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
                            if (imageDataSampleBuffer != NULL)
                            {
                                NSData *imageData = [AVCaptureStillImageOutput
                                    jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                                UIImage *photo = [[UIImage alloc] initWithData:imageData];
                                [self processImage:photo]; // this is a custom method
                            }
                        }];
                }
            }
        }
    }
}
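Since the question is about saving the picture to a file, processImage: could simply write the JPEG data to disk. A minimal sketch, where the file name and directory choice are only an example:

- (void)processImage:(UIImage *)photo
{
    // Write the image as JPEG into the app's Documents directory (illustrative path and name).
    NSString *documentsDir = [NSSearchPathForDirectoriesInDomains(NSDocumentDirectory,
                                                                  NSUserDomainMask, YES) firstObject];
    NSString *filePath = [documentsDir stringByAppendingPathComponent:@"photo.jpg"];
    NSData *jpegData = UIImageJPEGRepresentation(photo, 0.9);
    [jpegData writeToFile:filePath atomically:YES];
}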
photoSession is an AVCaptureSession * ivar of the class holding the takePhoto method.
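For context, a minimal sketch of how that could be declared (the class name is only an example):

// Assumes #import <UIKit/UIKit.h> and #import <AVFoundation/AVFoundation.h>.
@interface CameraViewController : UIViewController
{
    AVCaptureSession *photoSession; // kept alive for the duration of the capture
}
- (void)takePhoto;
- (void)processImage:(UIImage *)photo;
@end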
EDIT (tweak): If you change the if ( videoConnection ) block to the code below, you will add a one-second delay and get a good image.
if (videoConnection)
{
    [photoSession startRunning];
    // Wait roughly a second so exposure can settle before grabbing the frame.
    dispatch_time_t popTime = dispatch_time(DISPATCH_TIME_NOW, 1 * NSEC_PER_SEC);
    dispatch_after(popTime, dispatch_get_main_queue(), ^(void){
        [output captureStillImageAsynchronouslyFromConnection:videoConnection
            completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
                if (imageDataSampleBuffer != NULL)
                {
                    NSData *imageData = [AVCaptureStillImageOutput
                        jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                    UIImage *photo = [[UIImage alloc] initWithData:imageData];
                    [self processImage:photo];
                }
            }];
    });
}
If the lag is not acceptable for your application, you could split the code in two parts: start the photoSession in viewDidAppear (or somewhere similar) and simply take an immediate snapshot whenever needed - usually after some user interaction, as sketched below.
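A rough sketch of that split, assuming the session, output, and connection are kept in ivars built the same way as in takePhoto above (the setupPhotoSession helper, the stillOutput/videoConnection ivars, and the action method name are only illustrative):

// Start the already-configured session once the view is visible, so frames are flowing.
- (void)viewDidAppear:(BOOL)animated
{
    [super viewDidAppear:animated];
    [self setupPhotoSession]; // hypothetical helper: builds photoSession, stillOutput, videoConnection as above
    [photoSession startRunning];
}

// Later, e.g. from a button handler, capture immediately without any warm-up delay.
- (IBAction)takeSnapshot:(id)sender
{
    [stillOutput captureStillImageAsynchronouslyFromConnection:videoConnection
        completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
            if (imageDataSampleBuffer != NULL)
            {
                NSData *imageData = [AVCaptureStillImageOutput
                    jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
                [self processImage:[[UIImage alloc] initWithData:imageData]];
            }
        }];
}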
dispatch_time_t popTime = dispatch_time(DISPATCH_TIME_NOW, 0.25 * NSEC_PER_SEC);
also produces a good result, so there is no need for a whole second of lag.
Note that this code is written to take a photo with the front camera - I'm sure you will know how to adapt it if you need to use the back camera (see the sketch below).
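A minimal sketch of that change, just swapping the position check in the device loop:

// Pick the back camera instead of the front one.
if (camera.position == AVCaptureDevicePositionBack)
{
    frontalCamera = camera; // you would probably also rename the variable to backCamera
}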