Interpreting XMP metadata in ALAssetRepresentation



When a user makes changes (cropping, red-eye removal, ...) to photos in the built-in Photos.app on iOS, the changes are not applied to the fullResolutionImage of the corresponding ALAssetRepresentation.

However, the changes are applied to the thumbnail and the fullScreenImage returned by the ALAssetRepresentation. Furthermore, information about the applied changes can be found in the ALAssetRepresentation's metadata dictionary under the key @"AdjustmentXMP".

I would like to apply these changes to the fullResolutionImage myself to keep things consistent. I have found that on iOS 6+, CIFilter's filterArrayFromSerializedXMP:inputImageExtent:error: can convert this XMP metadata into an array of CIFilter objects:

ALAssetRepresentation *rep; 
NSString *xmpString = rep.metadata[@"AdjustmentXMP"];
NSData *xmpData = [xmpString dataUsingEncoding:NSUTF8StringEncoding];

CIImage *image = [CIImage imageWithCGImage:rep.fullResolutionImage];

NSError *error = nil;
NSArray *filterArray = [CIFilter filterArrayFromSerializedXMP:xmpData 
                                             inputImageExtent:image.extent 
                                                        error:&error];
if (error) {
     NSLog(@"Error during CIFilter creation: %@", [error localizedDescription]);
}

CIContext *context = [CIContext contextWithOptions:nil];

for (CIFilter *filter in filterArray) {
     [filter setValue:image forKey:kCIInputImageKey];
     image = [filter outputImage];
}
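
Note that the CIContext created above is never used in this snippet; to actually obtain the adjusted bitmap, the filtered CIImage still has to be rendered. A minimal sketch of that step (assuming the filter loop above has run, and relying on ALAssetOrientation values matching UIImageOrientation) could look like this:

// Render the filtered CIImage into a CGImage using the context created above.
// createCGImage:fromRect: returns a retained CGImageRef that must be released.
CGImageRef adjustedCGImage = [context createCGImage:image fromRect:image.extent];

// Wrap it in a UIImage, reusing the asset's orientation.
UIImage *adjustedImage = [UIImage imageWithCGImage:adjustedCGImage
                                             scale:1.0
                                       orientation:(UIImageOrientation)rep.orientation];
CGImageRelease(adjustedCGImage);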

However, this only works for some filters (cropping, auto-enhance) and not for others such as red-eye removal. In those cases the CIFilters have no visible effect. Therefore, my questions are:

  • Does anyone know of a way to create a red-eye removal CIFilter? (In a way consistent with Photos.app. A filter obtained via the key kCIImageAutoAdjustRedEye is not enough; for example, it does not take parameters for the positions of the eyes. See the sketch after this list.)
  • Is there a way to generate and apply these filters under iOS 5?
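
For reference, the baseline mentioned in the first question can be requested through CIImage's autoAdjustmentFiltersWithOptions: (available since iOS 5). This is only a sketch of the approach the question already considers insufficient, because the returned red-eye correction relies on Core Image's own face detection rather than explicit eye coordinates; it assumes the CIImage named image from the snippet above:

// Ask Core Image only for red-eye correction, suppressing the general enhancement filters.
NSDictionary *options = @{ kCIImageAutoAdjustEnhance : @NO,
                           kCIImageAutoAdjustRedEye  : @YES };
NSArray *redEyeFilters = [image autoAdjustmentFiltersWithOptions:options];

for (CIFilter *filter in redEyeFilters) {
    [filter setValue:image forKey:kCIInputImageKey];
    image = [filter outputImage];
}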

This link points to another Stack Overflow question that offers a red-eye algorithm. It is not much, but it is a start. stackoverflow.com/questions/133675/red-eye-reduction-algorithm
— Roecrew

On iOS 7, the code listed above applies the red-eye filter correctly (an internal CIRedEyeCorrections filter).
— paiv, 2014

Answers:


ALAssetRepresentation *representation = [[self assetAtIndex:index] defaultRepresentation];

// Create a buffer to hold the raw data of the asset's image.
uint8_t *buffer = (uint8_t *)malloc(representation.size);

// Copy the data from the asset into the buffer.
NSUInteger length = [representation getBytes:buffer fromOffset:0 length:representation.size error:nil];

if (length == 0) {
    free(buffer);
    return nil;
}

// Wrap the buffer in an NSData object; the buffer is freed together with the data object.
NSData *adata = [[NSData alloc] initWithBytesNoCopy:buffer length:representation.size freeWhenDone:YES];

// Set up a dictionary with a UTI hint. The UTI hint identifies the type
// of image we are dealing with (that is, a JPEG, PNG, or possibly a RAW file).
NSDictionary *sourceOptionsDict = [NSDictionary dictionaryWithObjectsAndKeys:
                                   (id)[representation UTI], kCGImageSourceTypeIdentifierHint, nil];

// Create a CGImageSource with the NSData. An image source can
// contain any number of thumbnails and full images.
CGImageSourceRef sourceRef = CGImageSourceCreateWithData((CFDataRef)adata, (CFDictionaryRef)sourceOptionsDict);

[adata release]; // non-ARC, as in the original answer; under ARC drop this and use __bridge casts

// Get a copy of the image properties from the CGImageSourceRef.
CFDictionaryRef imagePropertiesDictionary = CGImageSourceCopyPropertiesAtIndex(sourceRef, 0, NULL);

CFNumberRef imageWidth  = (CFNumberRef)CFDictionaryGetValue(imagePropertiesDictionary, kCGImagePropertyPixelWidth);
CFNumberRef imageHeight = (CFNumberRef)CFDictionaryGetValue(imagePropertiesDictionary, kCGImagePropertyPixelHeight);

int w = 0;
int h = 0;

CFNumberGetValue(imageWidth, kCFNumberIntType, &w);
CFNumberGetValue(imageHeight, kCFNumberIntType, &h);

// Clean up memory.
CFRelease(imagePropertiesDictionary);
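
The snippet stops after reading the pixel dimensions. A possible continuation, sketched here under the assumption that filterArray was obtained from filterArrayFromSerializedXMP:inputImageExtent:error: as in the question, decodes the full image from the same sourceRef and runs it through the XMP-derived filters:

// Decode the full-resolution image from the image source created above.
CGImageRef fullImage = CGImageSourceCreateImageAtIndex(sourceRef, 0, (CFDictionaryRef)sourceOptionsDict);
CFRelease(sourceRef);

if (fullImage) {
    CIImage *ciImage = [CIImage imageWithCGImage:fullImage];
    CGImageRelease(fullImage);

    // Apply the CIFilters derived from the @"AdjustmentXMP" metadata (see the question).
    for (CIFilter *filter in filterArray) {
        [filter setValue:ciImage forKey:kCIInputImageKey];
        ciImage = [filter outputImage];
    }
}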