
# AVFoundation iOS xcode9 beta2

Vincent Dondain edited this page Jun 21, 2017 · 1 revision

# AVFoundation.framework

diff -ruN /Applications/Xcode9-beta1.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCameraCalibrationData.h /Applications/Xcode9-beta2.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCameraCalibrationData.h
--- /Applications/Xcode9-beta1.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCameraCalibrationData.h	2017-05-24 00:28:27.000000000 -0400
+++ /Applications/Xcode9-beta2.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCameraCalibrationData.h	2017-06-11 18:48:49.000000000 -0400
@@ -87,7 +87,7 @@
     An NSData of floats describing the camera lens' radial distortions.
  
  @discussion
-    Images captured by a camera are geometrically warped by radial distortions in the lens. In order to project from the 2D image plane back into the 3D world, the images must be distortion corrected, or made rectilinear. Lens distortion is modeled using a one-dimensional lookup table of 32-bit float values evenly distributed along a radius from the center of the distortion to the farthest corner, with each value representing an elongation or compression of the radius (1.0 for any given point indicates no elongation). This model assumes radially symmetric lens distortion. When dealing with AVDepthData, the disparity / depth map representations are geometrically distorted to align with images produced by the camera. For more information, see the reference implementation below.
+    Images captured by a camera are geometrically warped by radial distortions in the lens. In order to project from the 2D image plane back into the 3D world, the images must be distortion corrected, or made rectilinear. Lens distortion is modeled using a one-dimensional lookup table of 32-bit float values evenly distributed along a radius from the center of the distortion to the farthest corner, with each value representing an elongation or compression of the radius (0.0 for any given point indicates no elongation). This model assumes radially symmetric lens distortion. When dealing with AVDepthData, the disparity / depth map representations are geometrically distorted to align with images produced by the camera. For more information, see the reference implementation below.
  */
 @property(nonatomic, readonly) NSData *lensDistortionLookupTable;
 
@@ -116,12 +116,14 @@
  
     To apply distortion correction to an image, you'd begin with an empty destination buffer and iterate through it row by row, calling the sample implementation below for each point in the output image, passing the lensDistortionLookupTable to find the corresponding value in the distorted image, and write it to your output buffer. Please note that the "point", "opticalCenter", and "imageSize" parameters below must be in the same coordinate system, i.e. both at full resolution, or both scaled to a different resolution but with the same aspect ratio.
  
+    The reference function below returns floating-point x and y values. If you wish to match the results with actual pixels in a bitmap, you should either round to the nearest integer value or interpolate from surrounding integer positions (i.e. bilinear interpolation from the 4 surrounding pixels).
+ 
 - (CGPoint)lensDistortionPointForPoint:(CGPoint)point
                            lookupTable:(NSData *)lookupTable
                distortionOpticalCenter:(CGPoint)opticalCenter
                              imageSize:(CGSize)imageSize
 {
-    // The lookup table holds the radial magnification for n linearly spaced radii.
+    // The lookup table holds the relative radial magnification for n linearly spaced radii.
     // The first position corresponds to radius = 0
     // The last position corresponds to the largest radius found in the image.
  
@@ -137,7 +139,7 @@
     // Determine the radius of the given point.
     float r_point = sqrtf( v_point_x * v_point_x + v_point_y * v_point_y );
  
-    // Look up the radial magnification to apply in the provided lookup table
+    // Look up the relative radial magnification to apply in the provided lookup table
     float magnification;
     const float *lookupTableValues = lookupTable.bytes;
     NSUInteger lookupTableCount = lookupTable.length / sizeof(float);
@@ -158,8 +160,8 @@
     }
  
     // Apply radial magnification
-    float new_v_point_x = magnification * v_point_x;
-    float new_v_point_y = magnification * v_point_y;
+    float new_v_point_x = v_point_x + magnification * v_point_x;
+    float new_v_point_y = v_point_y + magnification * v_point_y;
  
     // Construct output
     return CGPointMake( opticalCenter.x + new_v_point_x, opticalCenter.y + new_v_point_y );
diff -ruN /Applications/Xcode9-beta1.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureDevice.h /Applications/Xcode9-beta2.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureDevice.h
--- /Applications/Xcode9-beta1.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureDevice.h	2017-05-24 00:28:27.000000000 -0400
+++ /Applications/Xcode9-beta2.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCaptureDevice.h	2017-06-11 18:48:50.000000000 -0400
@@ -996,7 +996,7 @@
  @constant AVCaptureExposureDurationCurrent
     A special value that may be passed as the duration parameter of setExposureModeCustomWithDuration:ISO:completionHandler: to indicate that the caller does not wish to specify a value for the exposureDuration property, and that it should instead be set to its current value. Note that the device may be adjusting exposureDuration at the time of the call, in which case the value to which exposureDuration is set may differ from the value obtained by querying the exposureDuration property.
  */
-AVF_EXPORT const CMTime AVCaptureExposureDurationCurrent NS_AVAILABLE_IOS(8_0);
+AVF_EXPORT const CMTime AVCaptureExposureDurationCurrent NS_AVAILABLE_IOS(8_0) __TVOS_PROHIBITED;
 
 /*!
  @constant AVCaptureISOCurrent
@@ -1219,7 +1219,7 @@
  @constant AVCaptureWhiteBalanceGainsCurrent
     A special value that may be passed as a parameter of setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:completionHandler: to indicate that the caller does not wish to specify a value for deviceWhiteBalanceGains, and that gains should instead be locked at their value at the moment that white balance is locked.
  */
-AVF_EXPORT const AVCaptureWhiteBalanceGains AVCaptureWhiteBalanceGainsCurrent NS_AVAILABLE_IOS(8_0);
+AVF_EXPORT const AVCaptureWhiteBalanceGains AVCaptureWhiteBalanceGainsCurrent NS_AVAILABLE_IOS(8_0) __TVOS_PROHIBITED;
 
 /*!
  @method setWhiteBalanceModeLockedWithDeviceWhiteBalanceGains:completionHandler:
diff -ruN /Applications/Xcode9-beta1.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCapturePhotoOutput.h /Applications/Xcode9-beta2.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCapturePhotoOutput.h
--- /Applications/Xcode9-beta1.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCapturePhotoOutput.h	2017-05-24 00:37:44.000000000 -0400
+++ /Applications/Xcode9-beta2.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVCapturePhotoOutput.h	2017-06-12 01:08:15.000000000 -0400
@@ -1356,6 +1356,9 @@
  
  @result
     A CGImageRef, or nil if the conversion process fails.
+ 
+ @discussion
+    Each time you access this method, AVCapturePhoto generates a new CGImageRef. When backed by a compressed container (such as HEIC), the CGImageRepresentation is decoded lazily as needed. When backed by an uncompressed format such as BGRA, it is copied into a separate backing buffer whose lifetime is not tied to that of the AVCapturePhoto. For a 12 megapixel image, a BGRA CGImage represents ~48 megabytes per call. If you only intend to use the CGImage for on-screen rendering, use the previewCGImageRepresentation instead. Note that the physical rotation of the CGImageRef matches that of the main image. Exif orientation has not been applied. If you wish to apply rotation when working with UIImage, you can do so by querying the photo's metadata[kCGImagePropertyOrientation] value, and passing it as the orientation parameter to +[UIImage imageWithCGImage:scale:orientation:]. RAW images always return a CGImageRepresentation of nil. If you wish to make a CGImageRef from a RAW image, use CIRAWFilter in the CoreImage framework.
  */
 - (nullable CGImageRef)CGImageRepresentation NS_AVAILABLE_IOS(11_0);
 
@@ -1366,6 +1369,9 @@
  
  @result
     A CGImageRef, or nil if the conversion process fails, or if you did not request a preview photo.
+ 
+ @discussion
+    Each time you access this method, AVCapturePhoto generates a new CGImageRef. This CGImageRepresentation is a RGB rendering of the previewPixelBuffer property. If you did not request a preview photo by setting the -[AVCapturePhotoSettings previewPhotoFormat] property, this method returns nil. Note that the physical rotation of the CGImageRef matches that of the main image. Exif orientation has not been applied. If you wish to apply rotation when working with UIImage, you can do so by querying the photo's metadata[kCGImagePropertyOrientation] value, and passing it as the orientation parameter to +[UIImage imageWithCGImage:scale:orientation:].
  */
 - (nullable CGImageRef)previewCGImageRepresentation NS_AVAILABLE_IOS(11_0);
 
diff -ruN /Applications/Xcode9-beta1.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVContentKeySession.h /Applications/Xcode9-beta2.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVContentKeySession.h
--- /Applications/Xcode9-beta1.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVContentKeySession.h	2017-05-24 00:28:08.000000000 -0400
+++ /Applications/Xcode9-beta2.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVContentKeySession.h	2017-06-12 01:08:15.000000000 -0400
@@ -191,10 +191,10 @@
                 An opaque identifier for the application. The contents of this identifier depend on the particular protocol in use by the entity that controls the use of the media data.
  @param         storageURL
                 URL to a directory previously used with one or more instances of AVContentKeySession for the storage of expired session reports.
- @result        An NSArray containing instances of NSDictionary, each containing a pending expired session report. These contents depend on the particular protocol in use by the entity that controls the use of the media data.
+ @result        An NSArray containing instances of NSData, each containing a pending expired session report as a property-list serialization of an NSDictionary object. The contents of expired session reports depend on the particular protocol in use by the entity that controls the use of the media data.
  @discussion    Note that no reports for sessions still in progress will be included.
 */
-+ (NSArray <NSDictionary *> *)pendingExpiredSessionReportsWithAppIdentifier:(NSData *)appIdentifier storageDirectoryAtURL:(NSURL *)storageURL;
++ (NSArray <NSData *> *)pendingExpiredSessionReportsWithAppIdentifier:(NSData *)appIdentifier storageDirectoryAtURL:(NSURL *)storageURL;
 
 /*! 
  @method        removePendingExpiredSessionReports:withAppIdentifier:storageDirectoryAtURL:
@@ -207,7 +207,7 @@
                 URL to a writable folder.
  @discussion    This method is most suitable for use only after the specified expired session reports have been sent to the entity that controls the use of the media data and the entity has acknowledged their receipt.
 */
-+ (void)removePendingExpiredSessionReports:(NSArray <NSDictionary *> *)expiredSessionReports withAppIdentifier:(NSData *)appIdentifier storageDirectoryAtURL:(NSURL *)storageURL;
++ (void)removePendingExpiredSessionReports:(NSArray <NSData *> *)expiredSessionReports withAppIdentifier:(NSData *)appIdentifier storageDirectoryAtURL:(NSURL *)storageURL;
 
 @end
 
diff -ruN /Applications/Xcode9-beta1.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVFoundation.apinotes /Applications/Xcode9-beta2.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVFoundation.apinotes
--- /Applications/Xcode9-beta1.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVFoundation.apinotes	2017-05-23 21:01:42.000000000 -0400
+++ /Applications/Xcode9-beta2.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVFoundation.apinotes	2017-06-12 01:08:13.000000000 -0400
@@ -147,6 +147,15 @@
     MethodKind: Class
     Availability: nonswift
     AvailabilityMsg: 'use object initializers instead'
+  - Selector: 'supportedPhotoCodecTypesForFileType:'
+    SwiftName: 'supportedPhotoCodecTypes(for:)'
+    MethodKind: Instance
+  - Selector: 'supportedPhotoPixelFormatTypesForFileType:'
+    SwiftName: 'supportedPhotoPixelFormatTypes(for:)'
+    MethodKind: Instance
+  - Selector: 'supportedRawPhotoPixelFormatTypesForFileType:'
+    SwiftName: 'supportedRawPhotoPixelFormatTypes(for:)'
+    MethodKind: Instance
 - Name: AVCaptureStillImageOutput
   Methods:
   - Selector: 'new'
@@ -165,6 +174,9 @@
   - Selector: 'recommendedVideoSettingsForAssetWriterWithOutputFileType:'
     SwiftName: 'recommendedVideoSettingsForAssetWriter(writingTo:)'
     MethodKind: Instance
+  Properties:
+  - Name: availableVideoCVPixelFormatTypes
+    SwiftName: availableVideoPixelFormatTypes
 - Name: AVCaptureVideoPreviewLayer
   Methods:
   - Selector: 'captureDevicePointOfInterestForPoint:'
@@ -518,6 +530,8 @@
   SwiftName: AVCaptureDevice.TorchMode
 - Name: AVCaptureWhiteBalanceMode
   SwiftName: AVCaptureDevice.WhiteBalanceMode
+- Name: AVDepthDataAccuracy
+  SwiftName: AVDepthData.Accuracy
 - Name: AVError
   NSErrorDomain: AVFoundationErrorDomain
 Typedefs:
@@ -1435,6 +1449,7 @@
       Nullability: [ U, U ]
     Properties:
     - Name: availableVideoCVPixelFormatTypes
+      SwiftName: availableVideoCVPixelFormatTypes
       PropertyKind: Instance
       Nullability: U
       Type: 'NSArray *'
diff -ruN /Applications/Xcode9-beta1.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVFoundation.h /Applications/Xcode9-beta2.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVFoundation.h
--- /Applications/Xcode9-beta1.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVFoundation.h	2017-05-24 00:41:53.000000000 -0400
+++ /Applications/Xcode9-beta2.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVFoundation.h	2017-06-12 01:08:13.000000000 -0400
@@ -39,6 +39,7 @@
 
 #if TARGET_OS_IPHONE
 #import <AVFoundation/AVAssetDownloadTask.h>
+#import <AVFoundation/AVAssetDownloadStorageManager.h>
 #endif
 
 #if (TARGET_OS_IPHONE || defined(__MAC_10_7))
@@ -84,6 +85,7 @@
 #import <AVFoundation/AVPlayerLooper.h>
 #import <AVFoundation/AVPlayerMediaSelectionCriteria.h>
 #import <AVFoundation/AVQueuedSampleBufferRendering.h>
+#import <AVFoundation/AVRouteDetector.h>
 #import <AVFoundation/AVSampleBufferAudioRenderer.h>
 #import <AVFoundation/AVSampleBufferDisplayLayer.h>
 #import <AVFoundation/AVSampleBufferRenderSynchronizer.h>
diff -ruN /Applications/Xcode9-beta1.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVRouteDetector.h /Applications/Xcode9-beta2.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVRouteDetector.h
--- /Applications/Xcode9-beta1.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVRouteDetector.h	1969-12-31 19:00:00.000000000 -0500
+++ /Applications/Xcode9-beta2.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVRouteDetector.h	2017-06-12 01:08:15.000000000 -0400
@@ -0,0 +1,55 @@
+/*
+	File:  AVRouteDetector.h
+ 
+	Framework:  AVFoundation
+ 
+	Copyright 2017 Apple Inc. All rights reserved.
+ 
+ */
+
+#import <AVFoundation/AVBase.h>
+#import <Foundation/Foundation.h>
+
+@class AVRouteDetectorInternal;
+
+NS_ASSUME_NONNULL_BEGIN
+
+/*!
+ @class		AVRouteDetector
+ @abstract	AVRouteDetector detects the presence of media playback routes.
+ @discussion	
+	If route detection is enabled (it is disabled by default), AVRouteDetector reports whether or not multiple playback routes have been detected. If this is the case, AVKit's AVRoutePickerView can be used to allow users to pick from the set of available routes.
+ */
+
+API_AVAILABLE(macos(10.13), ios(11.0), tvos(11.0)) API_UNAVAILABLE(watchos)
+@interface AVRouteDetector : NSObject
+{
+@private
+	AVRouteDetectorInternal *_routeDetectorInternal;
+}
+
+/*!
+ @property	routeDetectionEnabled
+ @abstract	Whether or not route detection is enabled. The default value is NO.
+ @discussion	
+	Route detection significantly increases power consumption and must be turned off when it's no longer needed.
+ */
+@property (getter=isRouteDetectionEnabled) BOOL routeDetectionEnabled;
+
+/*!
+ @property	multipleRoutesDetected
+ @abstract	This property is YES if, in addition to the local playback route, at least one more playback route has been detected.
+ @discussion	
+	If multiple route have been detected, AVKit's AVRoutePickerView can be used to allow users to pick from the set of available routes. When the values of this property changes AVRouteDetectorMultipleRoutesDetectedDidChangeNotification is posted.
+ */
+@property (readonly) BOOL multipleRoutesDetected;
+
+/*!
+ @constant	AVRouteDetectorMultipleRoutesDetectedDidChangeNotification
+ @abstract	Posted when the value of multipleRoutesDetected changes.
+ */
+AVF_EXPORT NSNotificationName const AVRouteDetectorMultipleRoutesDetectedDidChangeNotification API_AVAILABLE(macos(10.13), ios(11.0), tvos(11.0)) API_UNAVAILABLE(watchos);
+
+@end
+
+NS_ASSUME_NONNULL_END
diff -ruN /Applications/Xcode9-beta1.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVTime.h /Applications/Xcode9-beta2.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVTime.h
--- /Applications/Xcode9-beta1.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVTime.h	2017-05-24 00:41:55.000000000 -0400
+++ /Applications/Xcode9-beta2.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVTime.h	2017-06-12 01:08:15.000000000 -0400
@@ -14,7 +14,13 @@
 
 NS_ASSUME_NONNULL_BEGIN
 
-// utilities for carriage of CoreMedia time structures in NSValues
+/*
+	Utilities for carriage of CoreMedia time structures in NSValues
+	
+	Notes for archiving NSValues created with CMTime / CMTimeRange / CMTimeMapping.
+	We recommend that on macOS Sierra, iOS 10, and later you use NSKeyedArchiver to serialize NSValues created with a CMTime, CMTimeRange, or CMTimeMapping. This will ensure that before and after archiving, use of such values with -isEqual:, with -[NSDictionary objectForKey:], and with other facilities that depend on hashing or equality, will have correct results.
+	Should it be necessary to write a value in a way that is readable pre-Sierra or pre-iOS 10, you can re-encode the values using [NSValue valueWithBytes:objCType:] before handing it to the NSArchiver. If you are reading serialized values produced with an NSArchiver, it will still successfully produce NSValues. If in this scenario you require the use of such values with -isEqual:, with -[NSDictionary objectForKey:], and with other facilities that depend on hashing or equality to have correct results, we recommend that you immediately re-create a new NSValue via the NSValue utilities provided here and use the new NSValue in place of the unarchived one.
+*/
 
 @interface NSValue (NSValueAVFoundationExtensions)
 
diff -ruN /Applications/Xcode9-beta1.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVVideoSettings.h /Applications/Xcode9-beta2.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVVideoSettings.h
--- /Applications/Xcode9-beta1.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVVideoSettings.h	2017-05-24 00:28:08.000000000 -0400
+++ /Applications/Xcode9-beta2.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/AVFoundation.framework/Headers/AVVideoSettings.h	2017-06-11 18:48:52.000000000 -0400
@@ -230,6 +230,17 @@
 	AVF_EXPORT NSString *const AVVideoAverageNonDroppableFrameRateKey /* NSNumber (frames per second) */		NS_AVAILABLE(10_10, 7_0);
 
 /*!
+ @constant	AVVideoDecompressionPropertiesKey
+ @abstract
+ The value for this key is an instance of NSDictionary, containing properties to be passed down to the video decoder.
+ @discussion
+ Package the below keys in an instance of NSDictionary and use it as the value for AVVideoDecompressionPropertiesKey in the top-level AVVideoSettings dictionary.  In addition to the keys listed below, you can also include keys from VideoToolbox/VTDecompressionProperties.h.
+ 
+ Most keys can only be used for certain decoders.  Look at individual keys for details.
+ */
+AVF_EXPORT NSString *const AVVideoDecompressionPropertiesKey /* NSDictionary */   API_AVAILABLE(macos(10.13)) __IOS_PROHIBITED __TVOS_PROHIBITED __WATCHOS_PROHIBITED;
+
+/*!
 	@constant AVVideoEncoderSpecificationKey
 	@abstract
 		The video encoder specification includes options for choosing a specific video encoder.