diff --git a/Changelog.md b/Changelog.md
new file mode 100644
index 0000000..74c5556
--- /dev/null
+++ b/Changelog.md
@@ -0,0 +1,204 @@
+## 0.0.17
++ Fixed C# compiler error on iOS.
+
+## 0.0.16
++ Added 32-bit `x86` architecture support on Android (#75).
++ Fixed `NullReferenceException` on startup when running on Windows (#78).
++ Fixed crash when calling `MediaAsset.SaveToCameraRoll` on macOS (#85).
++ Changed `VideoKitAudioManager.OnAudioBuffer` event type from `UnityEvent` to a plain `event` (#69).
++ Removed `MediaDeviceFilters` class.
++ Reduced minimum requirement to macOS 10.15 (#84).
+
+## 0.0.15
++ Added `MediaAsset.ToValue` method for creating Function prediction values from media assets.
++ Added `VideoKitAudioManager.OnAudioBuffer` event for receiving audio buffers from audio devices.
++ Added `VideoKitCameraManager.OnCameraImage` event for receiving camera images directly from the streaming camera device.
++ Added `VideoKitCameraManager.texture` property for accessing the camera preview texture.
++ Added `VideoKitCameraManager.pixelBuffer` property for accessing the camera preview pixel buffer.
++ Added `VideoKitCameraManager.humanTexture` property for accessing the camera human texture.
++ Added `VideoKitCameraManager.imageFeature` property for accessing the camera preview as an ML feature.
++ Added `VideoKitRecordButton.OnStartRecording` event.
++ Added `VideoKitRecordButton.OnStopRecording` event.
++ Fixed `MediaAsset.path` property containing invalid characters on Windows.
++ Fixed `MediaAsset.Share` task never completing when an exception is raised on Android.
++ Fixed `MediaAsset.Share` failing for apps that use Vuforia on Android.
++ Fixed `MediaAsset.SaveToCameraRoll` method failing because of missing write permissions on older versions of Android.
++ Fixed `MediaAsset.FromFile` method failing on WebGL due to URL mishandling.
++ Fixed `CameraDevice.WhiteBalanceModeSupported` always returning false for `WhiteBalanceMode.Continuous` on Android.
++ Fixed `CameraDevice.videoStabilizationMode` getter property causing hard crash on some Android devices.
++ Fixed `DllNotFoundException` when importing VideoKit in the Linux editor.
++ Fixed rare crash due to frame rate setting when `CameraDevice.Discover` is invoked.
++ Fixed rare crash when recording is started while rendering with OpenGL ES3 on Android.
++ Fixed rare crash when entering play mode in the Unity Editor due to the app domain being reloaded.
++ Removed `IMediaOutput` interface.
++ Removed `SampleBuffer` struct. Use `AudioBuffer` struct instead.
++ Removed `VideoKitAudioManager.OnSampleBuffer` event. Use `OnAudioBuffer` event instead.
++ Removed `CameraImage` parameter from `VideoKitCameraManager.OnCameraFrame` event.
+
+## 0.0.14
++ Added audio captioning using AI with the `AudioAsset.Caption` method.
++ Added ability to parse an arbitrary `struct` from text using AI with the `TextAsset.To` method.
++ Added ability to pick images and videos from the camera roll with the `MediaAsset.FromCameraRoll` method.
++ Added `MediaAsset` class for loading, inspecting, and sharing media.
++ Added `TextAsset` class for loading, inspecting, and extracting models from text.
++ Added `ImageAsset` class for loading, modifying, and sharing images.
++ Added `VideoAsset` class for loading, inspecting, and sharing videos.
++ Added `AudioAsset` class for loading, inspecting, and sharing audio.
++ Added `MediaRecorder` class to consolidate working with recorders.
++ Added `MediaFormat` enumeration for identifying and working with media formats.
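+
+  For example, a recording setup with the new `MediaRecorder` and `MediaFormat` APIs above might look like the following sketch. It is illustrative only: `MediaRecorder.Create` and the format members are taken from the entries in this changelog, while the parameter names (`width`, `height`, `frameRate`), the awaited call shapes, and `FinishWriting`'s return value are assumptions rather than the documented signature.
+
+  ```csharp
+  using UnityEngine;
+  using VideoKit;   // top-level namespace as of 0.0.14
+
+  public sealed class MediaRecorderExample : MonoBehaviour {
+
+      async void Start () {
+          // Create a recorder for a given format. The exact `Create` parameters
+          // are assumed here for illustration; check the VideoKit docs for the
+          // real signature.
+          var recorder = await MediaRecorder.Create(
+              MediaFormat.MP4,    // also: HEVC, GIF, WAV, WEBM, JPEG
+              width: 1280,        // assumed parameter name
+              height: 720,        // assumed parameter name
+              frameRate: 30       // assumed parameter name
+          );
+          // Commit pixel and audio buffers to the recorder here, then finish
+          // writing to receive the path to the recorded media.
+          var path = await recorder.FinishWriting();  // return/await shape assumed
+          Debug.Log($"Recorded media to {path}");
+      }
+  }
+  ```
+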
++ Added `AudioDevice.Discover` static method for discovering available microphones.
++ Added `CameraDevice.Discover` static method for discovering available cameras.
++ Added `CameraDevice.exposureDuration` property to get the current camera exposure duration in seconds.
++ Added `CameraDevice.ISO` property to get the current camera exposure sensitivity.
++ Added `VideoKitProjectSettings` class for managing VideoKit settings in the current Unity project.
++ Added `VideoKitRecorder.frameRate` property for setting the frame rate of recorded GIF images.
++ Added `VideoKitRecordButton.recorder` property for getting and setting the recorder on which the button acts.
++ Added automatic camera pausing and resuming when the app is suspended and resumed in `VideoKitCameraManager`.
++ Added native sharing support on macOS.
++ Added native sharing support on WebGL for browsers that are WebShare compliant.
++ Fixed `VideoKitRecorder.Resolution._240xAuto`, `_720xAuto`, and `_1080xAuto` constants resulting in incorrect resolutions.
++ Fixed visible artifacts when recording a camera that only clears depth or doesn't clear at all (#32).
++ Fixed camera permissions not being requested when calling `CameraDevice.CheckPermissions` on a fresh Android app install.
++ Fixed `CameraDevice` preview stream being frozen in the Safari browser on macOS.
++ Fixed `CameraDevice` focus being lost when setting `FocusMode.Locked` on Android.
++ Fixed `mimeType not supported` exception when creating a `WEBMRecorder` in the Safari browser.
++ Fixed `std::bad_function_call` exception when `AudioDevice.StopRunning` is called on WebGL.
++ Fixed `CommitFrame` exception when recording audio to a `WAV` file with the `VideoKitRecorder` class.
++ Fixed media preview in the native share UI not showing when sharing an image or video on Android.
++ Updated `VideoKitCameraManager.StartRunning` method to return a `Task` that can be awaited.
++ Updated `VideoKitAudioManager.StartRunning` method to return a `Task` that can be awaited.
++ Updated `VideoKitRecorder.StartRecording` method to return a `Task` that can be awaited.
++ Updated `JPEGRecorder.FinishWriting` to return the paths to all recorded image files, separated by the `Path.PathSeparator` character.
++ Refactored `IMediaDevice` interface to `MediaDevice` class.
++ Refactored `MediaDeviceCriteria` class to `MediaDeviceFilters`.
++ Refactored `DeviceLocation` enumeration to `MediaDevice.Location`.
++ Refactored `PermissionStatus` enumeration to `MediaDevice.PermissionStatus`.
++ Refactored `VideoKitCameraManager.Capabilities.MachineLearning` enumeration member to `Capabilities.AI`.
++ Removed `IMediaRecorder` interface. Use `MediaRecorder` class instead.
++ Removed `MP4Recorder` class. Use `MediaRecorder.Create` with `MediaFormat.MP4` instead.
++ Removed `HEVCRecorder` class. Use `MediaRecorder.Create` with `MediaFormat.HEVC` instead.
++ Removed `GIFRecorder` class. Use `MediaRecorder.Create` with `MediaFormat.GIF` instead.
++ Removed `WAVRecorder` class. Use `MediaRecorder.Create` with `MediaFormat.WAV` instead.
++ Removed `WEBMRecorder` class. Use `MediaRecorder.Create` with `MediaFormat.WEBM` instead.
++ Removed `JPEGRecorder` class. Use `MediaRecorder.Create` with `MediaFormat.JPEG` instead.
++ Removed `MediaDeviceQuery` class. Use the `AudioDevice.Discover` and `CameraDevice.Discover` methods instead.
++ Removed `SharePayload` class. Use `MediaAsset.Share` method instead.
++ Removed `SavePayload` class. Use `MediaAsset.SaveToCameraRoll` method instead.
++ Removed `AudioSpectrumOutput` class.
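+
+  A rough migration sketch for the removals above, moving from `MediaDeviceQuery`, `SharePayload`, and `SavePayload` to their `Discover` and `MediaAsset` replacements. The class and method names come from this changelog; the `FromFile` argument, the awaited return shapes, and the array-style results of `Discover` are assumptions for illustration.
+
+  ```csharp
+  using UnityEngine;
+  using VideoKit;
+
+  public sealed class MediaAssetMigrationExample : MonoBehaviour {
+
+      async void Start () {
+          // Device discovery replaces the removed `MediaDeviceQuery`.
+          // The awaited return types are assumed to be arrays.
+          var cameras = await CameraDevice.Discover();
+          var microphones = await AudioDevice.Discover();
+          Debug.Log($"Found {cameras.Length} cameras and {microphones.Length} microphones");
+
+          // Sharing and saving now go through `MediaAsset` instead of the removed
+          // `SharePayload` and `SavePayload` classes. `FromFile` arguments and the
+          // async shapes are assumptions.
+          var asset = await MediaAsset.FromFile("recording.mp4");
+          await asset.Share();
+          await asset.SaveToCameraRoll();
+      }
+  }
+  ```
+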
++ Removed `IEquatable` interface inheritance from `MediaDevice` class.
++ Removed `AudioDevice.Equals` method as audio devices no longer define a custom equality method.
++ Removed `CameraDevice.Equals` method as camera devices no longer define a custom equality method.
++ Removed `VideoKitRecorder.frameDuration` property. Use `VideoKitRecorder.frameRate` property instead.
++ Removed `VideoKitRecorder.Format` enumeration. Use `MediaFormat` enumeration instead.
++ Removed `VideoKitRecordButton.OnTouchDown` event.
++ Removed `VideoKitRecordButton.OnTouchUp` event.
++ Updated top-level namespace from `NatML.VideoKit` to `VideoKit`.
++ VideoKit now requires iOS 13+.
++ VideoKit now requires macOS 11+.
+
+## 0.0.13
++ Fixed crash when rapidly switching cameras on WebGL (#23).
++ Fixed rare memory exception when discovering audio devices on WebGL (#24).
++ Fixed `VideoKitCameraManager.StopRunning` not stopping the camera device on Safari (#25).
+
+## 0.0.12
++ Fixed resolution and frame rate settings not being applied when restarting `VideoKitCameraManager` (#19).
++ Fixed `VideoKitCameraManager` error when switching scenes in WebGL (#17).
++ Fixed `VideoKitRecorder.prepareOnAwake` setting still causing stutter on the first recording (#20).
+
+## 0.0.11
++ Added GPU acceleration for the background removal capability in `VideoKitCameraManager` on Android.
++ Added `VideoKitCameraManager.frameRate` property for setting the camera preview frame rate.
++ Added `VideoKitRecordButton` UI prefab for building recording UIs similar to Instagram.
++ Added `VideoKitRecorder.Destination.Playback` enumeration member for immediately playing back recorded media.
++ Added help URLs to VideoKit components in the Unity inspector.
++ Fixed `VideoKitCameraManager.device` property ignoring new values when the manager is not running.
++ Fixed sporadic crash when using the `HumanTexture` capability with `VideoKitCameraManager`.
++ Fixed crash when creating a `WEBMRecorder` with audio on WebGL.
++ Removed `VideoKitRecorder.OrientationMode` enumeration.
++ Removed `VideoKitRecorder.AspectMode` enumeration.
+
+## 0.0.10
++ Added support for realtime background removal using machine learning. [See the docs](https://docs.videokit.ai/videokit/workflows/background).
++ Added `MatteKitPredictor` for predicting a human texture from a given image.
++ Added `VideoKitCameraManager.facing` property for specifying a desired camera facing.
++ Added `VideoKitCameraManager.Facing` enumeration for specifying a desired camera facing.
++ Fixed camera preview being vertically mirrored when streaming the front camera on Android devices.
++ Refactored `VideoKitRecorder.Resolution._2K` enumeration member to `Resolution._2560xAuto`.
++ Refactored `VideoKitRecorder.Resolution._4K` enumeration member to `Resolution._3840xAuto`.
++ Refactored `VideoKitCameraManager.cameraDevice` property to `VideoKitCameraManager.device`.
++ Refactored `VideoKitAudioManager.audioDevice` property to `VideoKitAudioManager.device`.
+
+## 0.0.9
++ Upgraded to NatML 1.1.
+
+## 0.0.8
++ Added `VideoKitRecorder.Resolution.Custom` resolution preset for specifying a custom recording resolution.
++ Added `VideoKitRecorder.customResolution` property for setting a custom recording resolution.
++ Added `VideoKitCameraView.focusMode` setting for specifying how to handle camera focus gestures.
++ Added `VideoKitCameraView.exposureMode` setting for specifying how to handle camera exposure gestures.
++ Added `VideoKitCameraView.zoomMode` setting for specifying how to handle camera zoom gestures.
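+
+  A small sketch of using the `Resolution.Custom` preset and `customResolution` property above. The member names mirror the serialized fields exposed in the recorder inspector; the type of `customResolution` is not spelled out in this changelog, so the `Vector2Int` assignment below is an assumption.
+
+  ```csharp
+  using UnityEngine;
+  using VideoKit;  // `NatML.VideoKit` prior to 0.0.14
+
+  public sealed class CustomResolutionExample : MonoBehaviour {
+
+      [SerializeField] private VideoKitRecorder recorder;
+
+      private void Awake () {
+          // Select the custom preset, then supply the desired recording size.
+          recorder.resolution = VideoKitRecorder.Resolution.Custom;
+          recorder.customResolution = new Vector2Int(1280, 720);  // type assumed
+      }
+  }
+  ```
+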
++ Fixed bug where VideoKit components could not be added in the Unity 2022 editor.
++ Removed `VideoKitCameraFocus` component. Use `VideoKitCameraView.focusMode` setting instead.
++ Removed `VideoKitCameraZoom` component. Use `VideoKitCameraView.zoomMode` setting instead.
+
+## 0.0.7
++ Added `VideoKitRecorder.frameSkip` property for recording only every `n`th frame.
++ Fixed `VideoKitRecorder.StartRecording` throwing an error on Android with OpenGL ES3.
++ Fixed `VideoKitRecorder` exception when stopping a recording session on WebGL.
++ Fixed `NullReferenceException` in `VideoKitRecorder` when stopping recording without an `audioManager` assigned.
++ Refactored `VideoKitAudioManager.SampleRate._160000` to `SampleRate._16000`.
+
+## 0.0.6
++ Added `VideoKitAudioManager` component for managing streaming audio from audio devices.
++ Added `VideoKitRecorder.RecordingSession` struct for receiving richer information about a completed recording session.
++ Added `VideoKitRecorder.audioManager` property for managing recording audio from audio devices.
++ Added `VideoKitRecorder.Resolution._320x240` resolution preset.
++ Added `VideoKitRecorder.Resolution._480x320` resolution preset.
++ Fixed `VideoKitRecorder` not allowing the developer to select the `Destination.PromptUser` destination.
++ Fixed `VideoKitRecorder` producing video with incorrectly oriented size when using `Resolution.Screen` and `Orientation.Portrait`.
++ Refactored `VideoKitRecorder.orientation` property to `VideoKitRecorder.orientationMode`.
++ Refactored `VideoKitRecorder.aspect` property to `VideoKitRecorder.aspectMode`.
++ Refactored `VideoKitRecorder.videoKeyframeInterval` property to `VideoKitRecorder.keyframeInterval`.
++ Refactored `VideoKitCameraManager.OnFrame` event to `OnCameraFrame`.
++ Removed `VideoKitRecorder.OnRecordingFailed` event. Use `OnRecordingCompleted` event instead.
+
+## 0.0.5
++ Added `VideoKitRecorder.videoBitRate` property for specifying the video bitrate for applicable formats.
++ Added `VideoKitRecorder.videoKeyframeInterval` property for specifying the keyframe interval for applicable formats.
++ Added `VideoKitRecorder.audioBitRate` property for specifying the audio bitrate for applicable formats.
+
+## 0.0.4
++ Added `CropTextureInput` for recording a cropped area of the video.
++ Added `WatermarkTextureInput` for adding a watermark to recorded videos.
++ Added `VideoKitRecorder.VideoMode.CameraDevice` video mode for recording videos directly from a camera device.
++ Added `VideoKitRecorder.destinationPathPrefix` property for specifying the recording directory.
++ Added `VideoKitRecorder.Resolution._2K` resolution preset for recording at 2K WQHD.
++ Added `VideoKitRecorder.Resolution._4K` resolution preset for recording at 4K UHD.
++ Added `VideoKitCameraView.OnPresent` event to be notified when the view presents the camera preview to the user.
++ Added `VideoKitCameraFocus` UI component for focusing a camera device with tap gestures.
++ Fixed `CameraFrame.feature` property returning a new feature instance on every access.
++ Refactored `MicrophoneInput` class to `AudioDeviceInput`.
++ Refactored `VideoKitRecorder.AudioMode.Microphone` enumeration member to `AudioMode.AudioDevice`.
+
+## 0.0.3
++ Fixed `NullReferenceException` when running the camera with `Capabilities.MachineLearning` enabled.
++ Fixed rare crash when running the camera with `Capabilities.HumanTexture` enabled.
++ Fixed recording session not being ended when the `VideoKitRecorder` component is disabled or destroyed.
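+
+  A configuration sketch for the recorder properties introduced in 0.0.5 through 0.0.7 above. Property names (`videoBitRate`, `audioBitRate`, `frameSkip`) come from the entries in this changelog, and `keyframeInterval` reflects the later rename noted in 0.0.6; the numeric types, units, and skip semantics are assumptions.
+
+  ```csharp
+  using UnityEngine;
+  using VideoKit;  // `NatML.VideoKit` in the 0.0.x releases described here
+
+  public sealed class RecorderTuningExample : MonoBehaviour {
+
+      [SerializeField] private VideoKitRecorder recorder;
+
+      private void Awake () {
+          // Bitrates and keyframe interval apply to formats that support them.
+          recorder.videoBitRate = 10_000_000;  // 10 Mbps; numeric type assumed
+          recorder.audioBitRate = 64_000;      // 64 Kbps; numeric type assumed
+          recorder.keyframeInterval = 2;       // interval unit assumed
+          // Record every `n`th frame to reduce the effective frame rate.
+          recorder.frameSkip = 2;              // exact skip semantics assumed
+      }
+  }
+  ```
+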
+ +## 0.0.2 ++ Added `VideoKitCameraManager.Resolution.Default` resolution preset to leave camera resolution unchanged. ++ Added `VideoKitCameraManager.Capabilities.DepthTexture` enumeration member for streaming camera depth. ++ Added `MicrophoneInput` recorder input for recording audio frames from an `AudioDevice`. ++ Added implicit conversion from `CameraFrame` to `CameraImage`. ++ Fixed `CameraFrame.image` being uninitialized in `VideoKitCameraManager.OnFrame`. ++ Refactored `VideoKitCameraManager.Play` method to `StartRunning`. ++ Refactored `VideoKitCameraManager.Stop` method to `StopRunning`. ++ Removed `CameraFrame.width` property. Use `CameraFrame.image.width` instead. ++ Removed `CameraFrame.height` property. Use `CameraFrame.image.height` instead. ++ Removed `CameraFrame.pixelBuffer` property. ++ Removed `CameraFrame.timestamp` property. ++ Removed `VideoKitCameraManager.Capabilities.PixelData` enumeration member. + +## 0.0.1 ++ First pre-release. \ No newline at end of file diff --git a/Changelog.md.meta b/Changelog.md.meta new file mode 100644 index 0000000..2de1338 --- /dev/null +++ b/Changelog.md.meta @@ -0,0 +1,7 @@ +fileFormatVersion: 2 +guid: a599cf0eb0d974e23b8d930ae1f040fa +TextScriptImporter: + externalObjects: {} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Editor.meta b/Editor.meta new file mode 100644 index 0000000..6ff72c8 --- /dev/null +++ b/Editor.meta @@ -0,0 +1,8 @@ +fileFormatVersion: 2 +guid: f3b8b24e8ee574231b6a54fa22d3d0ab +folderAsset: yes +DefaultImporter: + externalObjects: {} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Editor/VideoKit.Editor.asmdef b/Editor/VideoKit.Editor.asmdef new file mode 100644 index 0000000..4fe2666 --- /dev/null +++ b/Editor/VideoKit.Editor.asmdef @@ -0,0 +1,18 @@ +{ + "name": "VideoKit.Editor", + "references": [ + "VideoKit.Runtime", + "NatML.Editor" + ], + "includePlatforms": [ + "Editor" + ], + "excludePlatforms": [], + "allowUnsafeCode": false, + "overrideReferences": false, + "precompiledReferences": [], + "autoReferenced": true, + "defineConstraints": [], + "versionDefines": [], + "noEngineReferences": false +} \ No newline at end of file diff --git a/Editor/VideoKit.Editor.asmdef.meta b/Editor/VideoKit.Editor.asmdef.meta new file mode 100644 index 0000000..b73d21a --- /dev/null +++ b/Editor/VideoKit.Editor.asmdef.meta @@ -0,0 +1,7 @@ +fileFormatVersion: 2 +guid: b588e0daf7a7b47e2b69ab0eae8dbd8b +AssemblyDefinitionImporter: + externalObjects: {} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Editor/VideoKitAndroidBuild.cs b/Editor/VideoKitAndroidBuild.cs new file mode 100644 index 0000000..f0243bf --- /dev/null +++ b/Editor/VideoKitAndroidBuild.cs @@ -0,0 +1,31 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. 
+*/ + +namespace VideoKit.Editor { + + using UnityEditor; + using UnityEditor.Build; + using UnityEditor.Build.Reporting; + using Internal; + + internal sealed class VideoKitAndroidBuild : IPreprocessBuildWithReport { + + int IOrderedCallback.callbackOrder => 0; + + void IPreprocessBuildWithReport.OnPreprocessBuild (BuildReport report) { + // Check + if (report.summary.platform != BuildTarget.Android) + return; + // Retrieve + var guids = AssetDatabase.FindAssets("videokit-androidx-core"); + if (guids.Length == 0) + return; + // Update importer + var path = AssetDatabase.GUIDToAssetPath(guids[0]); + var importer = PluginImporter.GetAtPath(path) as PluginImporter; + importer.SetCompatibleWithPlatform(BuildTarget.Android, VideoKitProjectSettings.instance.EmbedAndroidX); + } + } +} \ No newline at end of file diff --git a/Editor/VideoKitAndroidBuild.cs.meta b/Editor/VideoKitAndroidBuild.cs.meta new file mode 100644 index 0000000..5b6aeb9 --- /dev/null +++ b/Editor/VideoKitAndroidBuild.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: da9c81955aca64a56b4d5c991c8f263f +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {instanceID: 0} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Editor/VideoKitEditorInfo.cs b/Editor/VideoKitEditorInfo.cs new file mode 100644 index 0000000..bed0811 --- /dev/null +++ b/Editor/VideoKitEditorInfo.cs @@ -0,0 +1,16 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +using System.Reflection; +using System.Runtime.CompilerServices; +using UnityEngine; +using VideoKit.Internal; + +// Metadata +[assembly: AssemblyCompany(@"NatML Inc")] +[assembly: AssemblyTitle(@"VideoKit.Editor")] +[assembly: AssemblyVersionAttribute(VideoKitClient.Version)] +[assembly: AssemblyCopyright(@"Copyright © 2023 NatML Inc. All Rights Reserved.")] +[assembly: AssemblyIsEditorAssembly] \ No newline at end of file diff --git a/Editor/VideoKitEditorInfo.cs.meta b/Editor/VideoKitEditorInfo.cs.meta new file mode 100644 index 0000000..ae35888 --- /dev/null +++ b/Editor/VideoKitEditorInfo.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: 4d26cb50168c74fcf8d43b630dc3f319 +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {instanceID: 0} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Editor/VideoKitMenu.cs b/Editor/VideoKitMenu.cs new file mode 100644 index 0000000..93630e9 --- /dev/null +++ b/Editor/VideoKitMenu.cs @@ -0,0 +1,27 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. 
+*/
+
+namespace VideoKit.Editor {
+
+    using UnityEditor;
+    using Internal;
+
+    internal static class VideoKitMenu {
+
+        private const int BasePriority = 600;
+
+        [MenuItem(@"NatML/VideoKit " + VideoKitClient.Version, false, BasePriority)]
+        private static void Version () { }
+
+        [MenuItem(@"NatML/VideoKit " + VideoKitClient.Version, true, BasePriority)]
+        private static bool DisableVersion () => false;
+
+        [MenuItem(@"NatML/View VideoKit Docs", false, BasePriority + 1)]
+        private static void OpenDocs () => Help.BrowseURL(@"https://docs.natml.ai/videokit");
+
+        [MenuItem(@"NatML/Open a VideoKit Issue", false, BasePriority + 2)]
+        private static void OpenIssue () => Help.BrowseURL(@"https://github.com/natmlx/videokit");
+    }
+}
\ No newline at end of file
diff --git a/Editor/VideoKitMenu.cs.meta b/Editor/VideoKitMenu.cs.meta
new file mode 100644
index 0000000..7303413
--- /dev/null
+++ b/Editor/VideoKitMenu.cs.meta
@@ -0,0 +1,11 @@
+fileFormatVersion: 2
+guid: 60a9620ffb88a4f8d8a711380043eb7b
+MonoImporter:
+  externalObjects: {}
+  serializedVersion: 2
+  defaultReferences: []
+  executionOrder: 0
+  icon: {fileID: 2800000, guid: d6df589e0376343f7a3b406329630a74, type: 3}
+  userData: 
+  assetBundleName: 
+  assetBundleVariant: 
diff --git a/Editor/VideoKitProjectSettings.cs b/Editor/VideoKitProjectSettings.cs
new file mode 100644
index 0000000..231037e
--- /dev/null
+++ b/Editor/VideoKitProjectSettings.cs
@@ -0,0 +1,106 @@
+/*
+* VideoKit
+* Copyright © 2023 NatML Inc. All Rights Reserved.
+*/
+
+#nullable enable
+
+namespace VideoKit.Editor {
+
+    using System;
+    using System.Collections.Generic;
+    using System.Threading.Tasks;
+    using UnityEngine;
+    using UnityEngine.Serialization;
+    using UnityEditor;
+    using Internal;
+
+    /// <summary>
+    /// VideoKit settings for the current Unity project.
+    /// </summary>
+    [FilePath(@"ProjectSettings/VideoKit.asset", FilePathAttribute.Location.ProjectFolder)]
+    public sealed class VideoKitProjectSettings : ScriptableSingleton<VideoKitProjectSettings> {
+
+        #region --Data--
+        [SerializeField, FormerlySerializedAs(@"licenseKey")]
+        private string accessKey = @"";
+
+        [SerializeField]
+        private bool androidx = true;
+
+        [SerializeField]
+        private string photoLibraryUsageDescription = @"";
+        #endregion
+
+
+        #region --Client API--
+        /// <summary>
+        /// VideoKit access key.
+        /// </summary>
+        internal string AccessKey {
+            get => accessKey;
+            set {
+                // Check
+                if (value == accessKey)
+                    return;
+                // Set
+                accessKey = value;
+                Save(false);
+                // Update
+                VideoKit.SetSessionToken(null);
+            }
+        }
+
+        /// <summary>
+        /// Whether to embed the `androidx` support library in the build.
+        /// </summary>
+        public bool EmbedAndroidX {
+            get => androidx;
+            set {
+                // Check
+                if (value == androidx)
+                    return;
+                // Set
+                androidx = value;
+                Save(false);
+            }
+        }
+
+        /// <summary>
+        /// Photo library usage description presented to the user when sharing a media asset.
+        /// This only applies on iOS.
+        /// </summary>
+        public string PhotoLibraryUsageDescription {
+            get => photoLibraryUsageDescription;
+            set {
+                // Check
+                if (value == photoLibraryUsageDescription)
+                    return;
+                // Set
+                photoLibraryUsageDescription = value;
+                Save(false);
+            }
+        }
+
+        /// <summary>
+        /// Create VideoKit settings from the current project settings.
+        /// </summary>
+        internal static VideoKitSettings CreateSettings () {
+            var accessKey = !string.IsNullOrEmpty(instance.AccessKey) ? instance.AccessKey : VideoKitSettings.FallbackAccessKey;
+            var settings = ScriptableObject.CreateInstance<VideoKitSettings>();
+            settings.accessKey = accessKey;
+            return settings;
+        }
+        #endregion
+
+
+        #region --Operations--
+
+        [InitializeOnLoadMethod]
+        private static void OnLoad () => VideoKitSettings.Instance = CreateSettings();
+
+        [InitializeOnEnterPlayMode]
+        private static void OnEnterPlaymodeInEditor (EnterPlayModeOptions options) => VideoKitSettings.Instance = CreateSettings();
+        #endregion
+    }
+}
\ No newline at end of file
diff --git a/Editor/VideoKitProjectSettings.cs.meta b/Editor/VideoKitProjectSettings.cs.meta
new file mode 100644
index 0000000..233a7ad
--- /dev/null
+++ b/Editor/VideoKitProjectSettings.cs.meta
@@ -0,0 +1,11 @@
+fileFormatVersion: 2
+guid: aa120fd9bb22540e2a7b606aecda23da
+MonoImporter:
+  externalObjects: {}
+  serializedVersion: 2
+  defaultReferences: []
+  executionOrder: 0
+  icon: {fileID: 2800000, guid: d6df589e0376343f7a3b406329630a74, type: 3}
+  userData: 
+  assetBundleName: 
+  assetBundleVariant: 
diff --git a/Editor/VideoKitRecorderEditor.cs b/Editor/VideoKitRecorderEditor.cs
new file mode 100644
index 0000000..db97ef1
--- /dev/null
+++ b/Editor/VideoKitRecorderEditor.cs
@@ -0,0 +1,97 @@
+/*
+* VideoKit
+* Copyright © 2023 NatML Inc. All Rights Reserved.
+*/
+
+namespace VideoKit.Editor {
+
+    using System;
+    using System.Collections.Generic;
+    using System.Linq;
+    using UnityEditor;
+    using UnityEngine;
+    using static VideoKitRecorder;
+    using Resolution = VideoKitRecorder.Resolution;
+
+    [CustomEditor(typeof(VideoKitRecorder)), CanEditMultipleObjects]
+    internal sealed class VideoKitRecorderEditor : Editor {
+
+        private readonly Dictionary<string, SerializedProperty> properties = new();
+
+        private void OnEnable () {
+            var propertyNames = new [] {
+                @"format", @"recordingAction", @"prepareOnAwake", @"videoMode", @"audioMode", @"resolution", @"customResolution",
+                @"cameras", @"texture", @"cameraManager", @"frameRate", @"frameSkip", @"watermarkMode", @"watermark",
+                @"watermarkRect", @"audioManager", @"configureAudioManager", @"audioDeviceGain", @"OnRecordingCompleted",
+            };
+            foreach (var name in propertyNames) {
+                var property = serializedObject.FindProperty(name);
+                if (property == null)
+                    throw new InvalidOperationException($"Cannot find property {name} in `VideoKitRecorder`");
+                properties.Add(name, property);
+            }
+        }
+
+        public override void OnInspectorGUI () {
+            serializedObject.Update();
+            // Format
+            EditorGUILayout.PropertyField(properties[@"format"]);
+            EditorGUILayout.PropertyField(properties[@"prepareOnAwake"]);
+            // Video mode
+            var format = (MediaFormat)Enum.GetValues(typeof(MediaFormat)).GetValue(properties[@"format"].enumValueIndex);
+            if (format.SupportsVideo()) {
+                var videoMode = (VideoMode)Enum.GetValues(typeof(VideoMode)).GetValue(properties[@"videoMode"].enumValueIndex);
+                EditorGUILayout.PropertyField(properties[@"videoMode"]);
+                if (videoMode != VideoMode.None) {
+                    // Video resolution
+                    var resolution = (Resolution)Enum.GetValues(typeof(Resolution)).GetValue(properties[@"resolution"].enumValueIndex);
+                    EditorGUILayout.PropertyField(properties[@"resolution"]);
+                    if (resolution == Resolution.Custom)
+                        EditorGUILayout.PropertyField(properties[@"customResolution"]);
+                    // Video input
+                    switch (videoMode) {
+                        case VideoMode.Camera:
+                            EditorGUILayout.PropertyField(properties[@"cameras"]);
+                            break;
+                        case VideoMode.Texture:
+                            EditorGUILayout.PropertyField(properties[@"texture"]);
+                            break;
+                        case VideoMode.CameraDevice:
+                            EditorGUILayout.PropertyField(properties[@"cameraManager"]);
+                            break;
+                    }
+                    // Frame duration
+                    if (format == MediaFormat.GIF)
+                        EditorGUILayout.PropertyField(properties[@"frameRate"]);
+                    // Frame skip
+                    EditorGUILayout.PropertyField(properties[@"frameSkip"]);
+                    // Watermark
+                    var watermarkMode = (WatermarkMode)Enum.GetValues(typeof(WatermarkMode)).GetValue(properties[@"watermarkMode"].enumValueIndex);
+                    EditorGUILayout.PropertyField(properties[@"watermarkMode"]);
+                    if (watermarkMode != WatermarkMode.None)
+                        EditorGUILayout.PropertyField(properties[@"watermark"]);
+                    if (watermarkMode == WatermarkMode.Custom)
+                        EditorGUILayout.PropertyField(properties[@"watermarkRect"]);
+                }
+            }
+            // Audio mode
+            if (format.SupportsAudio()) {
+                var audioMode = (AudioMode)properties[@"audioMode"].enumValueFlag;
+                EditorGUILayout.PropertyField(properties[@"audioMode"]);
+                if (audioMode.HasFlag(AudioMode.AudioDevice)) {
+                    EditorGUILayout.PropertyField(properties[@"audioManager"]);
+                    EditorGUILayout.PropertyField(properties[@"configureAudioManager"]);
+                }
+                if (audioMode.HasFlag(AudioMode.AudioListener) && audioMode.HasFlag(AudioMode.AudioDevice))
+                    EditorGUILayout.PropertyField(properties[@"audioDeviceGain"]);
+            }
+            // Events
+            var recordingAction = (RecordingAction)properties[@"recordingAction"].enumValueFlag;
+            EditorGUILayout.PropertyField(properties[@"recordingAction"]);
+            if (recordingAction.HasFlag(RecordingAction.Custom))
+                EditorGUILayout.PropertyField(properties[@"OnRecordingCompleted"]);
+            // Apply
+            serializedObject.ApplyModifiedProperties();
+        }
+    }
+}
\ No newline at end of file
diff --git a/Editor/VideoKitRecorderEditor.cs.meta b/Editor/VideoKitRecorderEditor.cs.meta
new file mode 100644
index 0000000..a90d1f9
--- /dev/null
+++ b/Editor/VideoKitRecorderEditor.cs.meta
@@ -0,0 +1,11 @@
+fileFormatVersion: 2
+guid: 3d9dea6b503394ecf8f1639353ad2910
+MonoImporter:
+  externalObjects: {}
+  serializedVersion: 2
+  defaultReferences: []
+  executionOrder: 0
+  icon: {fileID: 2800000, guid: d6df589e0376343f7a3b406329630a74, type: 3}
+  userData: 
+  assetBundleName: 
+  assetBundleVariant: 
diff --git a/Editor/VideoKitSettingsEmbed.cs b/Editor/VideoKitSettingsEmbed.cs
new file mode 100644
index 0000000..ca3f155
--- /dev/null
+++ b/Editor/VideoKitSettingsEmbed.cs
@@ -0,0 +1,64 @@
+/*
+* VideoKit
+* Copyright © 2023 NatML Inc. All Rights Reserved.
+*/
+
+namespace VideoKit.Editor {
+
+    using System;
+    using System.IO;
+    using System.Threading.Tasks;
+    using UnityEditor;
+    using UnityEditor.Build.Reporting;
+    using UnityEngine;
+    using NatML.Editor;
+    using Internal;
+
+    internal sealed class VideoKitSettingsEmbed : BuildEmbedHelper<VideoKitSettings> {
+
+        protected override BuildTarget[] SupportedTargets => new [] {
+            BuildTarget.Android,
+            BuildTarget.iOS,
+            BuildTarget.StandaloneOSX,
+            BuildTarget.StandaloneWindows,
+            BuildTarget.StandaloneWindows64,
+            BuildTarget.WebGL,
+        };
+        private const string CachePath = @"Assets/__VIDEOKIT_DELETE_THIS__";
+
+        protected override VideoKitSettings[] CreateEmbeds (BuildReport report) {
+            var platform = ToPlatform(report.summary.platform);
+            var bundleId = Application.identifier;
+            var settings = VideoKitProjectSettings.CreateSettings();
+            var client = new VideoKitClient(settings.accessKey);
+            // Create build token
+            try {
+                settings.buildToken = Task.Run(() => client.CreateBuildToken()).Result;
+                settings.sessionToken = Task.Run(() => client.CreateSessionToken(settings.buildToken, bundleId, platform)).Result;
+            } catch (Exception ex) {
+                Debug.LogWarning($"VideoKit: {ex.Message}");
+                Debug.LogException(ex);
+            }
+            // Embed
+            Directory.CreateDirectory(CachePath);
+            AssetDatabase.CreateAsset(settings, $"{CachePath}/VideoKit.asset");
+            return new [] { settings };
+        }
+
+        protected override void ClearEmbeds (BuildReport report) {
+            base.ClearEmbeds(report);
+            AssetDatabase.DeleteAsset(CachePath);
+        }
+
+        private static string ToPlatform (BuildTarget target) => target switch {
+            BuildTarget.Android => "ANDROID",
+            BuildTarget.iOS => "IOS",
+            BuildTarget.StandaloneLinux64 => "LINUX",
+            BuildTarget.StandaloneOSX => "MACOS",
+            BuildTarget.StandaloneWindows => "WINDOWS",
+            BuildTarget.StandaloneWindows64 => "WINDOWS",
+            BuildTarget.WebGL => "WEB",
+            _ => null,
+        };
+    }
+}
\ No newline at end of file
diff --git a/Editor/VideoKitSettingsEmbed.cs.meta b/Editor/VideoKitSettingsEmbed.cs.meta
new file mode 100644
index 0000000..190ae3d
--- /dev/null
+++ b/Editor/VideoKitSettingsEmbed.cs.meta
@@ -0,0 +1,11 @@
+fileFormatVersion: 2
+guid: 79fcaea4f39904f9cbeb58dd021e34e5
+MonoImporter:
+  externalObjects: {}
+  serializedVersion: 2
+  defaultReferences: []
+  executionOrder: 0
+  icon: {fileID: 2800000, guid: d6df589e0376343f7a3b406329630a74, type: 3}
+  userData: 
+  assetBundleName: 
+  assetBundleVariant: 
diff --git a/Editor/VideoKitSettingsProvider.cs b/Editor/VideoKitSettingsProvider.cs
new file mode 100644
index 0000000..77da3e0
--- /dev/null
+++ b/Editor/VideoKitSettingsProvider.cs
@@ -0,0 +1,32 @@
+/*
+* VideoKit
+* Copyright © 2023 NatML Inc. All Rights Reserved.
+*/
+
+namespace VideoKit.Editor {
+
+    using System.Collections.Generic;
+    using UnityEditor;
+
+    internal static class VideoKitSettingsProvider {
+
+        [SettingsProvider]
+        public static SettingsProvider CreateProvider () => new SettingsProvider(@"Project/VideoKit", SettingsScope.Project) {
+            label = @"VideoKit",
+            guiHandler = searchContext => {
+                // License
+                EditorGUILayout.LabelField(@"VideoKit Account", EditorStyles.boldLabel);
+                VideoKitProjectSettings.instance.AccessKey = EditorGUILayout.TextField(@"Access Key", VideoKitProjectSettings.instance.AccessKey);
+                EditorGUILayout.Space(10);
+                // Android settings
+                EditorGUILayout.LabelField(@"Android Settings", EditorStyles.boldLabel);
+                VideoKitProjectSettings.instance.EmbedAndroidX = EditorGUILayout.Toggle(@"Embed AndroidX Library", VideoKitProjectSettings.instance.EmbedAndroidX);
+                EditorGUILayout.Space(10);
+                // iOS settings
+                EditorGUILayout.LabelField(@"iOS Settings", EditorStyles.boldLabel);
+                VideoKitProjectSettings.instance.PhotoLibraryUsageDescription = EditorGUILayout.TextField(@"Photo Library Usage Description", VideoKitProjectSettings.instance.PhotoLibraryUsageDescription);
+            },
+            keywords = new HashSet<string>(new[] { @"VideoKit", @"NatML", @"NatCorder", @"NatDevice", @"NatShare", @"Hub" }),
+        };
+    }
+}
\ No newline at end of file
diff --git a/Editor/VideoKitSettingsProvider.cs.meta b/Editor/VideoKitSettingsProvider.cs.meta
new file mode 100644
index 0000000..b71a238
--- /dev/null
+++ b/Editor/VideoKitSettingsProvider.cs.meta
@@ -0,0 +1,11 @@
+fileFormatVersion: 2
+guid: ef16136307e7f48da8e5e99d77e02f0f
+MonoImporter:
+  externalObjects: {}
+  serializedVersion: 2
+  defaultReferences: []
+  executionOrder: 0
+  icon: {instanceID: 0}
+  userData: 
+  assetBundleName: 
+  assetBundleVariant: 
diff --git a/Editor/VideoKitiOSBuild.cs b/Editor/VideoKitiOSBuild.cs
new file mode 100644
index 0000000..4d31be7
--- /dev/null
+++ b/Editor/VideoKitiOSBuild.cs
@@ -0,0 +1,48 @@
+/*
+* VideoKit
+* Copyright © 2023 NatML Inc. All Rights Reserved.
+*/ + +namespace VideoKit.Editor { + + using System.IO; + using UnityEditor; + using UnityEditor.Build; + using UnityEditor.Build.Reporting; + using Internal; + + #if UNITY_IOS + using UnityEditor.iOS.Xcode; + #endif + + internal sealed class VideoKitiOSBuild : IPostprocessBuildWithReport { + + int IOrderedCallback.callbackOrder => 0; + + void IPostprocessBuildWithReport.OnPostprocessBuild (BuildReport report) { + // Check + if (report.summary.platform != BuildTarget.iOS) + return; + // Add photo library usage description + AddPhotoLibraryUsageDescription(report); + } + + private static void AddPhotoLibraryUsageDescription (BuildReport report) { + #if UNITY_IOS + var description = VideoKitProjectSettings.instance.PhotoLibraryUsageDescription; + var outputPath = report.summary.outputPath; + if (!string.IsNullOrEmpty(description)) { + // Read plist + var plistPath = Path.Combine(outputPath, @"Info.plist"); + var plist = new PlistDocument(); + plist.ReadFromString(File.ReadAllText(plistPath)); + // Add photo library descriptions + plist.root.SetString(@"NSPhotoLibraryUsageDescription", description); + plist.root.SetString(@"NSPhotoLibraryAddUsageDescription", description); + // Write to file + File.WriteAllText(plistPath, plist.WriteToString()); + } + #endif + } + } +} \ No newline at end of file diff --git a/Editor/VideoKitiOSBuild.cs.meta b/Editor/VideoKitiOSBuild.cs.meta new file mode 100644 index 0000000..ac85663 --- /dev/null +++ b/Editor/VideoKitiOSBuild.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: 97759e456d1b84edd9bccafc39bc76b0 +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {instanceID: 0} + userData: + assetBundleName: + assetBundleVariant: diff --git a/LICENSE.md b/LICENSE.md new file mode 100644 index 0000000..f49a4e1 --- /dev/null +++ b/LICENSE.md @@ -0,0 +1,201 @@ + Apache License + Version 2.0, January 2004 + http://www.apache.org/licenses/ + + TERMS AND CONDITIONS FOR USE, REPRODUCTION, AND DISTRIBUTION + + 1. Definitions. + + "License" shall mean the terms and conditions for use, reproduction, + and distribution as defined by Sections 1 through 9 of this document. + + "Licensor" shall mean the copyright owner or entity authorized by + the copyright owner that is granting the License. + + "Legal Entity" shall mean the union of the acting entity and all + other entities that control, are controlled by, or are under common + control with that entity. For the purposes of this definition, + "control" means (i) the power, direct or indirect, to cause the + direction or management of such entity, whether by contract or + otherwise, or (ii) ownership of fifty percent (50%) or more of the + outstanding shares, or (iii) beneficial ownership of such entity. + + "You" (or "Your") shall mean an individual or Legal Entity + exercising permissions granted by this License. + + "Source" form shall mean the preferred form for making modifications, + including but not limited to software source code, documentation + source, and configuration files. + + "Object" form shall mean any form resulting from mechanical + transformation or translation of a Source form, including but + not limited to compiled object code, generated documentation, + and conversions to other media types. + + "Work" shall mean the work of authorship, whether in Source or + Object form, made available under the License, as indicated by a + copyright notice that is included in or attached to the work + (an example is provided in the Appendix below). 
+ + "Derivative Works" shall mean any work, whether in Source or Object + form, that is based on (or derived from) the Work and for which the + editorial revisions, annotations, elaborations, or other modifications + represent, as a whole, an original work of authorship. For the purposes + of this License, Derivative Works shall not include works that remain + separable from, or merely link (or bind by name) to the interfaces of, + the Work and Derivative Works thereof. + + "Contribution" shall mean any work of authorship, including + the original version of the Work and any modifications or additions + to that Work or Derivative Works thereof, that is intentionally + submitted to Licensor for inclusion in the Work by the copyright owner + or by an individual or Legal Entity authorized to submit on behalf of + the copyright owner. For the purposes of this definition, "submitted" + means any form of electronic, verbal, or written communication sent + to the Licensor or its representatives, including but not limited to + communication on electronic mailing lists, source code control systems, + and issue tracking systems that are managed by, or on behalf of, the + Licensor for the purpose of discussing and improving the Work, but + excluding communication that is conspicuously marked or otherwise + designated in writing by the copyright owner as "Not a Contribution." + + "Contributor" shall mean Licensor and any individual or Legal Entity + on behalf of whom a Contribution has been received by Licensor and + subsequently incorporated within the Work. + + 2. Grant of Copyright License. Subject to the terms and conditions of + this License, each Contributor hereby grants to You a perpetual, + worldwide, non-exclusive, no-charge, royalty-free, irrevocable + copyright license to reproduce, prepare Derivative Works of, + publicly display, publicly perform, sublicense, and distribute the + Work and such Derivative Works in Source or Object form. + + 3. Grant of Patent License. Subject to the terms and conditions of + this License, each Contributor hereby grants to You a perpetual, + worldwide, non-exclusive, no-charge, royalty-free, irrevocable + (except as stated in this section) patent license to make, have made, + use, offer to sell, sell, import, and otherwise transfer the Work, + where such license applies only to those patent claims licensable + by such Contributor that are necessarily infringed by their + Contribution(s) alone or by combination of their Contribution(s) + with the Work to which such Contribution(s) was submitted. If You + institute patent litigation against any entity (including a + cross-claim or counterclaim in a lawsuit) alleging that the Work + or a Contribution incorporated within the Work constitutes direct + or contributory patent infringement, then any patent licenses + granted to You under this License for that Work shall terminate + as of the date such litigation is filed. + + 4. Redistribution. 
You may reproduce and distribute copies of the + Work or Derivative Works thereof in any medium, with or without + modifications, and in Source or Object form, provided that You + meet the following conditions: + + (a) You must give any other recipients of the Work or + Derivative Works a copy of this License; and + + (b) You must cause any modified files to carry prominent notices + stating that You changed the files; and + + (c) You must retain, in the Source form of any Derivative Works + that You distribute, all copyright, patent, trademark, and + attribution notices from the Source form of the Work, + excluding those notices that do not pertain to any part of + the Derivative Works; and + + (d) If the Work includes a "NOTICE" text file as part of its + distribution, then any Derivative Works that You distribute must + include a readable copy of the attribution notices contained + within such NOTICE file, excluding those notices that do not + pertain to any part of the Derivative Works, in at least one + of the following places: within a NOTICE text file distributed + as part of the Derivative Works; within the Source form or + documentation, if provided along with the Derivative Works; or, + within a display generated by the Derivative Works, if and + wherever such third-party notices normally appear. The contents + of the NOTICE file are for informational purposes only and + do not modify the License. You may add Your own attribution + notices within Derivative Works that You distribute, alongside + or as an addendum to the NOTICE text from the Work, provided + that such additional attribution notices cannot be construed + as modifying the License. + + You may add Your own copyright statement to Your modifications and + may provide additional or different license terms and conditions + for use, reproduction, or distribution of Your modifications, or + for any such Derivative Works as a whole, provided Your use, + reproduction, and distribution of the Work otherwise complies with + the conditions stated in this License. + + 5. Submission of Contributions. Unless You explicitly state otherwise, + any Contribution intentionally submitted for inclusion in the Work + by You to the Licensor shall be under the terms and conditions of + this License, without any additional terms or conditions. + Notwithstanding the above, nothing herein shall supersede or modify + the terms of any separate license agreement you may have executed + with Licensor regarding such Contributions. + + 6. Trademarks. This License does not grant permission to use the trade + names, trademarks, service marks, or product names of the Licensor, + except as required for reasonable and customary use in describing the + origin of the Work and reproducing the content of the NOTICE file. + + 7. Disclaimer of Warranty. Unless required by applicable law or + agreed to in writing, Licensor provides the Work (and each + Contributor provides its Contributions) on an "AS IS" BASIS, + WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or + implied, including, without limitation, any warranties or conditions + of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A + PARTICULAR PURPOSE. You are solely responsible for determining the + appropriateness of using or redistributing the Work and assume any + risks associated with Your exercise of permissions under this License. + + 8. Limitation of Liability. 
In no event and under no legal theory, + whether in tort (including negligence), contract, or otherwise, + unless required by applicable law (such as deliberate and grossly + negligent acts) or agreed to in writing, shall any Contributor be + liable to You for damages, including any direct, indirect, special, + incidental, or consequential damages of any character arising as a + result of this License or out of the use or inability to use the + Work (including but not limited to damages for loss of goodwill, + work stoppage, computer failure or malfunction, or any and all + other commercial damages or losses), even if such Contributor + has been advised of the possibility of such damages. + + 9. Accepting Warranty or Additional Liability. While redistributing + the Work or Derivative Works thereof, You may choose to offer, + and charge a fee for, acceptance of support, warranty, indemnity, + or other liability obligations and/or rights consistent with this + License. However, in accepting such obligations, You may act only + on Your own behalf and on Your sole responsibility, not on behalf + of any other Contributor, and only if You agree to indemnify, + defend, and hold each Contributor harmless for any liability + incurred by, or claims asserted against, such Contributor by reason + of your accepting any such warranty or additional liability. + + END OF TERMS AND CONDITIONS + + APPENDIX: How to apply the Apache License to your work. + + To apply the Apache License to your work, attach the following + boilerplate notice, with the fields enclosed by brackets "[]" + replaced with your own identifying information. (Don't include + the brackets!) The text should be enclosed in the appropriate + comment syntax for the file format. We also recommend that a + file or class name and description of purpose be included on the + same "printed page" as the copyright notice for easier + identification within third-party archives. + + Copyright [yyyy] [name of copyright owner] + + Licensed under the Apache License, Version 2.0 (the "License"); + you may not use this file except in compliance with the License. + You may obtain a copy of the License at + + http://www.apache.org/licenses/LICENSE-2.0 + + Unless required by applicable law or agreed to in writing, software + distributed under the License is distributed on an "AS IS" BASIS, + WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied. + See the License for the specific language governing permissions and + limitations under the License. 
\ No newline at end of file diff --git a/LICENSE.md.meta b/LICENSE.md.meta new file mode 100644 index 0000000..78d5acb --- /dev/null +++ b/LICENSE.md.meta @@ -0,0 +1,7 @@ +fileFormatVersion: 2 +guid: b861ef462dad149a98e23aab99e32893 +TextScriptImporter: + externalObjects: {} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Media.meta b/Media.meta new file mode 100644 index 0000000..1170fca --- /dev/null +++ b/Media.meta @@ -0,0 +1,8 @@ +fileFormatVersion: 2 +guid: 090af5c949a7c4956b5a3704bfb34501 +folderAsset: yes +DefaultImporter: + externalObjects: {} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Media/camera-streaming.gif b/Media/camera-streaming.gif new file mode 100644 index 0000000..2c5fbb0 Binary files /dev/null and b/Media/camera-streaming.gif differ diff --git a/Media/camera-streaming.gif.meta b/Media/camera-streaming.gif.meta new file mode 100644 index 0000000..21c8b04 --- /dev/null +++ b/Media/camera-streaming.gif.meta @@ -0,0 +1,166 @@ +fileFormatVersion: 2 +guid: 5daae151366ab4197b1c7f564578d278 +TextureImporter: + internalIDToNameTable: [] + externalObjects: {} + serializedVersion: 12 + mipmaps: + mipMapMode: 0 + enableMipMap: 1 + sRGBTexture: 1 + linearTexture: 0 + fadeOut: 0 + borderMipMap: 0 + mipMapsPreserveCoverage: 0 + alphaTestReferenceValue: 0.5 + mipMapFadeDistanceStart: 1 + mipMapFadeDistanceEnd: 3 + bumpmap: + convertToNormalMap: 0 + externalNormalMap: 0 + heightScale: 0.25 + normalMapFilter: 0 + flipGreenChannel: 0 + isReadable: 0 + streamingMipmaps: 0 + streamingMipmapsPriority: 0 + vTOnly: 0 + ignoreMipmapLimit: 0 + grayScaleToAlpha: 0 + generateCubemap: 6 + cubemapConvolution: 0 + seamlessCubemap: 0 + textureFormat: 1 + maxTextureSize: 2048 + textureSettings: + serializedVersion: 2 + filterMode: 1 + aniso: 1 + mipBias: 0 + wrapU: 0 + wrapV: 0 + wrapW: 0 + nPOTScale: 1 + lightmap: 0 + compressionQuality: 50 + spriteMode: 0 + spriteExtrude: 1 + spriteMeshType: 1 + alignment: 0 + spritePivot: {x: 0.5, y: 0.5} + spritePixelsToUnits: 100 + spriteBorder: {x: 0, y: 0, z: 0, w: 0} + spriteGenerateFallbackPhysicsShape: 1 + alphaUsage: 1 + alphaIsTransparency: 0 + spriteTessellationDetail: -1 + textureType: 0 + textureShape: 1 + singleChannelComponent: 0 + flipbookRows: 1 + flipbookColumns: 1 + maxTextureSizeSet: 0 + compressionQualitySet: 0 + textureFormatSet: 0 + ignorePngGamma: 0 + applyGammaDecoding: 0 + swizzle: 50462976 + cookieLightType: 0 + platformSettings: + - serializedVersion: 3 + buildTarget: DefaultTexturePlatform + maxTextureSize: 2048 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 0 + - serializedVersion: 3 + buildTarget: WebGL + maxTextureSize: 2048 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 0 + - serializedVersion: 3 + buildTarget: Standalone + maxTextureSize: 2048 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 0 + - serializedVersion: 3 + buildTarget: iPhone + maxTextureSize: 2048 + 
resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 0 + - serializedVersion: 3 + buildTarget: Android + maxTextureSize: 2048 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 0 + - serializedVersion: 3 + buildTarget: Server + maxTextureSize: 2048 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 0 + spriteSheet: + serializedVersion: 2 + sprites: [] + outline: [] + physicsShape: [] + bones: [] + spriteID: + internalID: 0 + vertices: [] + indices: + edges: [] + weights: [] + secondaryTextures: [] + nameFileIdTable: {} + mipmapLimitGroupName: + pSDRemoveMatte: 0 + userData: + assetBundleName: + assetBundleVariant: diff --git a/Media/create-access-key.gif b/Media/create-access-key.gif new file mode 100644 index 0000000..5671be8 Binary files /dev/null and b/Media/create-access-key.gif differ diff --git a/Media/create-access-key.gif.meta b/Media/create-access-key.gif.meta new file mode 100644 index 0000000..dbd7dca --- /dev/null +++ b/Media/create-access-key.gif.meta @@ -0,0 +1,166 @@ +fileFormatVersion: 2 +guid: 8b8eecc1408354723b845d7a0d6ba31d +TextureImporter: + internalIDToNameTable: [] + externalObjects: {} + serializedVersion: 12 + mipmaps: + mipMapMode: 0 + enableMipMap: 1 + sRGBTexture: 1 + linearTexture: 0 + fadeOut: 0 + borderMipMap: 0 + mipMapsPreserveCoverage: 0 + alphaTestReferenceValue: 0.5 + mipMapFadeDistanceStart: 1 + mipMapFadeDistanceEnd: 3 + bumpmap: + convertToNormalMap: 0 + externalNormalMap: 0 + heightScale: 0.25 + normalMapFilter: 0 + flipGreenChannel: 0 + isReadable: 0 + streamingMipmaps: 0 + streamingMipmapsPriority: 0 + vTOnly: 0 + ignoreMipmapLimit: 0 + grayScaleToAlpha: 0 + generateCubemap: 6 + cubemapConvolution: 0 + seamlessCubemap: 0 + textureFormat: 1 + maxTextureSize: 2048 + textureSettings: + serializedVersion: 2 + filterMode: 1 + aniso: 1 + mipBias: 0 + wrapU: 0 + wrapV: 0 + wrapW: 0 + nPOTScale: 1 + lightmap: 0 + compressionQuality: 50 + spriteMode: 0 + spriteExtrude: 1 + spriteMeshType: 1 + alignment: 0 + spritePivot: {x: 0.5, y: 0.5} + spritePixelsToUnits: 100 + spriteBorder: {x: 0, y: 0, z: 0, w: 0} + spriteGenerateFallbackPhysicsShape: 1 + alphaUsage: 1 + alphaIsTransparency: 0 + spriteTessellationDetail: -1 + textureType: 0 + textureShape: 1 + singleChannelComponent: 0 + flipbookRows: 1 + flipbookColumns: 1 + maxTextureSizeSet: 0 + compressionQualitySet: 0 + textureFormatSet: 0 + ignorePngGamma: 0 + applyGammaDecoding: 0 + swizzle: 50462976 + cookieLightType: 0 + platformSettings: + - serializedVersion: 3 + buildTarget: DefaultTexturePlatform + maxTextureSize: 2048 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 0 + - serializedVersion: 3 + buildTarget: WebGL + maxTextureSize: 2048 + resizeAlgorithm: 0 + textureFormat: -1 + 
textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 0 + - serializedVersion: 3 + buildTarget: Standalone + maxTextureSize: 2048 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 0 + - serializedVersion: 3 + buildTarget: iPhone + maxTextureSize: 2048 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 0 + - serializedVersion: 3 + buildTarget: Android + maxTextureSize: 2048 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 0 + - serializedVersion: 3 + buildTarget: Server + maxTextureSize: 2048 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 0 + spriteSheet: + serializedVersion: 2 + sprites: [] + outline: [] + physicsShape: [] + bones: [] + spriteID: + internalID: 0 + vertices: [] + indices: + edges: [] + weights: [] + secondaryTextures: [] + nameFileIdTable: {} + mipmapLimitGroupName: + pSDRemoveMatte: 0 + userData: + assetBundleName: + assetBundleVariant: diff --git a/Media/human-texture.gif b/Media/human-texture.gif new file mode 100644 index 0000000..8880f4f Binary files /dev/null and b/Media/human-texture.gif differ diff --git a/Media/human-texture.gif.meta b/Media/human-texture.gif.meta new file mode 100644 index 0000000..d5af3e1 --- /dev/null +++ b/Media/human-texture.gif.meta @@ -0,0 +1,166 @@ +fileFormatVersion: 2 +guid: c53406777e7624da4871936a1e5cc80e +TextureImporter: + internalIDToNameTable: [] + externalObjects: {} + serializedVersion: 12 + mipmaps: + mipMapMode: 0 + enableMipMap: 1 + sRGBTexture: 1 + linearTexture: 0 + fadeOut: 0 + borderMipMap: 0 + mipMapsPreserveCoverage: 0 + alphaTestReferenceValue: 0.5 + mipMapFadeDistanceStart: 1 + mipMapFadeDistanceEnd: 3 + bumpmap: + convertToNormalMap: 0 + externalNormalMap: 0 + heightScale: 0.25 + normalMapFilter: 0 + flipGreenChannel: 0 + isReadable: 0 + streamingMipmaps: 0 + streamingMipmapsPriority: 0 + vTOnly: 0 + ignoreMipmapLimit: 0 + grayScaleToAlpha: 0 + generateCubemap: 6 + cubemapConvolution: 0 + seamlessCubemap: 0 + textureFormat: 1 + maxTextureSize: 2048 + textureSettings: + serializedVersion: 2 + filterMode: 1 + aniso: 1 + mipBias: 0 + wrapU: 0 + wrapV: 0 + wrapW: 0 + nPOTScale: 1 + lightmap: 0 + compressionQuality: 50 + spriteMode: 0 + spriteExtrude: 1 + spriteMeshType: 1 + alignment: 0 + spritePivot: {x: 0.5, y: 0.5} + spritePixelsToUnits: 100 + spriteBorder: {x: 0, y: 0, z: 0, w: 0} + spriteGenerateFallbackPhysicsShape: 1 + alphaUsage: 1 + alphaIsTransparency: 0 + spriteTessellationDetail: -1 + textureType: 0 + textureShape: 1 + singleChannelComponent: 0 + flipbookRows: 1 + flipbookColumns: 1 + maxTextureSizeSet: 0 + 
compressionQualitySet: 0 + textureFormatSet: 0 + ignorePngGamma: 0 + applyGammaDecoding: 0 + swizzle: 50462976 + cookieLightType: 0 + platformSettings: + - serializedVersion: 3 + buildTarget: DefaultTexturePlatform + maxTextureSize: 2048 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 0 + - serializedVersion: 3 + buildTarget: WebGL + maxTextureSize: 2048 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 0 + - serializedVersion: 3 + buildTarget: Standalone + maxTextureSize: 2048 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 0 + - serializedVersion: 3 + buildTarget: iPhone + maxTextureSize: 2048 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 0 + - serializedVersion: 3 + buildTarget: Android + maxTextureSize: 2048 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 0 + - serializedVersion: 3 + buildTarget: Server + maxTextureSize: 2048 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 0 + spriteSheet: + serializedVersion: 2 + sprites: [] + outline: [] + physicsShape: [] + bones: [] + spriteID: + internalID: 0 + vertices: [] + indices: + edges: [] + weights: [] + secondaryTextures: [] + nameFileIdTable: {} + mipmapLimitGroupName: + pSDRemoveMatte: 0 + userData: + assetBundleName: + assetBundleVariant: diff --git a/Media/nml_icon.png b/Media/nml_icon.png new file mode 100644 index 0000000..a36f3f5 Binary files /dev/null and b/Media/nml_icon.png differ diff --git a/Media/nml_icon.png.meta b/Media/nml_icon.png.meta new file mode 100644 index 0000000..5c63a4a --- /dev/null +++ b/Media/nml_icon.png.meta @@ -0,0 +1,166 @@ +fileFormatVersion: 2 +guid: 7728882fe6960415e897bae270b67d4e +TextureImporter: + internalIDToNameTable: [] + externalObjects: {} + serializedVersion: 12 + mipmaps: + mipMapMode: 0 + enableMipMap: 1 + sRGBTexture: 1 + linearTexture: 0 + fadeOut: 0 + borderMipMap: 0 + mipMapsPreserveCoverage: 0 + alphaTestReferenceValue: 0.5 + mipMapFadeDistanceStart: 1 + mipMapFadeDistanceEnd: 3 + bumpmap: + convertToNormalMap: 0 + externalNormalMap: 0 + heightScale: 0.25 + normalMapFilter: 0 + flipGreenChannel: 0 + isReadable: 0 + streamingMipmaps: 0 + streamingMipmapsPriority: 0 + vTOnly: 0 + ignoreMipmapLimit: 0 + grayScaleToAlpha: 0 + generateCubemap: 6 + cubemapConvolution: 0 + seamlessCubemap: 0 + textureFormat: 1 + maxTextureSize: 2048 + 
textureSettings: + serializedVersion: 2 + filterMode: 1 + aniso: 1 + mipBias: 0 + wrapU: 0 + wrapV: 0 + wrapW: 0 + nPOTScale: 0 + lightmap: 0 + compressionQuality: 50 + spriteMode: 0 + spriteExtrude: 1 + spriteMeshType: 1 + alignment: 0 + spritePivot: {x: 0.5, y: 0.5} + spritePixelsToUnits: 100 + spriteBorder: {x: 0, y: 0, z: 0, w: 0} + spriteGenerateFallbackPhysicsShape: 1 + alphaUsage: 1 + alphaIsTransparency: 1 + spriteTessellationDetail: -1 + textureType: 0 + textureShape: 1 + singleChannelComponent: 0 + flipbookRows: 1 + flipbookColumns: 1 + maxTextureSizeSet: 0 + compressionQualitySet: 0 + textureFormatSet: 0 + ignorePngGamma: 0 + applyGammaDecoding: 0 + swizzle: 50462976 + cookieLightType: 0 + platformSettings: + - serializedVersion: 3 + buildTarget: DefaultTexturePlatform + maxTextureSize: 2048 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 0 + - serializedVersion: 3 + buildTarget: Standalone + maxTextureSize: 2048 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 0 + - serializedVersion: 3 + buildTarget: Server + maxTextureSize: 2048 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 0 + - serializedVersion: 3 + buildTarget: Android + maxTextureSize: 2048 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 0 + - serializedVersion: 3 + buildTarget: iPhone + maxTextureSize: 2048 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 0 + - serializedVersion: 3 + buildTarget: WebGL + maxTextureSize: 2048 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 0 + spriteSheet: + serializedVersion: 2 + sprites: [] + outline: [] + physicsShape: [] + bones: [] + spriteID: + internalID: 0 + vertices: [] + indices: + edges: [] + weights: [] + secondaryTextures: [] + nameFileIdTable: {} + mipmapLimitGroupName: + pSDRemoveMatte: 0 + userData: + assetBundleName: + assetBundleVariant: diff --git a/Media/nml_splash.png b/Media/nml_splash.png new file mode 100644 index 0000000..defdd20 Binary files /dev/null and b/Media/nml_splash.png differ diff --git a/Media/nml_splash.png.meta b/Media/nml_splash.png.meta new file mode 100644 index 0000000..a629c88 --- /dev/null +++ b/Media/nml_splash.png.meta @@ -0,0 +1,166 @@ +fileFormatVersion: 2 +guid: 77faf95936ae84eca85161f45abf3eb3 +TextureImporter: + internalIDToNameTable: [] + externalObjects: {} + serializedVersion: 12 + 
mipmaps: + mipMapMode: 0 + enableMipMap: 1 + sRGBTexture: 1 + linearTexture: 0 + fadeOut: 0 + borderMipMap: 0 + mipMapsPreserveCoverage: 0 + alphaTestReferenceValue: 0.5 + mipMapFadeDistanceStart: 1 + mipMapFadeDistanceEnd: 3 + bumpmap: + convertToNormalMap: 0 + externalNormalMap: 0 + heightScale: 0.25 + normalMapFilter: 0 + flipGreenChannel: 0 + isReadable: 0 + streamingMipmaps: 0 + streamingMipmapsPriority: 0 + vTOnly: 0 + ignoreMipmapLimit: 0 + grayScaleToAlpha: 0 + generateCubemap: 6 + cubemapConvolution: 0 + seamlessCubemap: 0 + textureFormat: 1 + maxTextureSize: 2048 + textureSettings: + serializedVersion: 2 + filterMode: 1 + aniso: 1 + mipBias: 0 + wrapU: 0 + wrapV: 0 + wrapW: 0 + nPOTScale: 1 + lightmap: 0 + compressionQuality: 50 + spriteMode: 0 + spriteExtrude: 1 + spriteMeshType: 1 + alignment: 0 + spritePivot: {x: 0.5, y: 0.5} + spritePixelsToUnits: 100 + spriteBorder: {x: 0, y: 0, z: 0, w: 0} + spriteGenerateFallbackPhysicsShape: 1 + alphaUsage: 1 + alphaIsTransparency: 1 + spriteTessellationDetail: -1 + textureType: 0 + textureShape: 1 + singleChannelComponent: 0 + flipbookRows: 1 + flipbookColumns: 1 + maxTextureSizeSet: 0 + compressionQualitySet: 0 + textureFormatSet: 0 + ignorePngGamma: 0 + applyGammaDecoding: 0 + swizzle: 50462976 + cookieLightType: 0 + platformSettings: + - serializedVersion: 3 + buildTarget: DefaultTexturePlatform + maxTextureSize: 2048 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 0 + - serializedVersion: 3 + buildTarget: Standalone + maxTextureSize: 2048 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 0 + - serializedVersion: 3 + buildTarget: Server + maxTextureSize: 2048 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 0 + - serializedVersion: 3 + buildTarget: Android + maxTextureSize: 2048 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 0 + - serializedVersion: 3 + buildTarget: iPhone + maxTextureSize: 2048 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 0 + - serializedVersion: 3 + buildTarget: WebGL + maxTextureSize: 2048 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 0 + spriteSheet: + serializedVersion: 2 + sprites: [] + outline: [] + physicsShape: [] + bones: [] + spriteID: + internalID: 0 + vertices: [] + indices: + edges: [] + weights: [] + secondaryTextures: [] + nameFileIdTable: {} + 
mipmapLimitGroupName: + pSDRemoveMatte: 0 + userData: + assetBundleName: + assetBundleVariant: diff --git a/Media/set-access-key.gif b/Media/set-access-key.gif new file mode 100644 index 0000000..8bd6079 Binary files /dev/null and b/Media/set-access-key.gif differ diff --git a/Media/set-access-key.gif.meta b/Media/set-access-key.gif.meta new file mode 100644 index 0000000..c7c2dec --- /dev/null +++ b/Media/set-access-key.gif.meta @@ -0,0 +1,166 @@ +fileFormatVersion: 2 +guid: 25315618277dd4abd9104c5b260bec2d +TextureImporter: + internalIDToNameTable: [] + externalObjects: {} + serializedVersion: 12 + mipmaps: + mipMapMode: 0 + enableMipMap: 1 + sRGBTexture: 1 + linearTexture: 0 + fadeOut: 0 + borderMipMap: 0 + mipMapsPreserveCoverage: 0 + alphaTestReferenceValue: 0.5 + mipMapFadeDistanceStart: 1 + mipMapFadeDistanceEnd: 3 + bumpmap: + convertToNormalMap: 0 + externalNormalMap: 0 + heightScale: 0.25 + normalMapFilter: 0 + flipGreenChannel: 0 + isReadable: 0 + streamingMipmaps: 0 + streamingMipmapsPriority: 0 + vTOnly: 0 + ignoreMipmapLimit: 0 + grayScaleToAlpha: 0 + generateCubemap: 6 + cubemapConvolution: 0 + seamlessCubemap: 0 + textureFormat: 1 + maxTextureSize: 2048 + textureSettings: + serializedVersion: 2 + filterMode: 1 + aniso: 1 + mipBias: 0 + wrapU: 0 + wrapV: 0 + wrapW: 0 + nPOTScale: 1 + lightmap: 0 + compressionQuality: 50 + spriteMode: 0 + spriteExtrude: 1 + spriteMeshType: 1 + alignment: 0 + spritePivot: {x: 0.5, y: 0.5} + spritePixelsToUnits: 100 + spriteBorder: {x: 0, y: 0, z: 0, w: 0} + spriteGenerateFallbackPhysicsShape: 1 + alphaUsage: 1 + alphaIsTransparency: 0 + spriteTessellationDetail: -1 + textureType: 0 + textureShape: 1 + singleChannelComponent: 0 + flipbookRows: 1 + flipbookColumns: 1 + maxTextureSizeSet: 0 + compressionQualitySet: 0 + textureFormatSet: 0 + ignorePngGamma: 0 + applyGammaDecoding: 0 + swizzle: 50462976 + cookieLightType: 0 + platformSettings: + - serializedVersion: 3 + buildTarget: DefaultTexturePlatform + maxTextureSize: 2048 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 0 + - serializedVersion: 3 + buildTarget: WebGL + maxTextureSize: 2048 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 0 + - serializedVersion: 3 + buildTarget: Standalone + maxTextureSize: 2048 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 0 + - serializedVersion: 3 + buildTarget: iPhone + maxTextureSize: 2048 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 0 + - serializedVersion: 3 + buildTarget: Android + maxTextureSize: 2048 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + 
forceMaximumCompressionQuality_BC6H_BC7: 0 + - serializedVersion: 3 + buildTarget: Server + maxTextureSize: 2048 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 0 + spriteSheet: + serializedVersion: 2 + sprites: [] + outline: [] + physicsShape: [] + bones: [] + spriteID: + internalID: 0 + vertices: [] + indices: + edges: [] + weights: [] + secondaryTextures: [] + nameFileIdTable: {} + mipmapLimitGroupName: + pSDRemoveMatte: 0 + userData: + assetBundleName: + assetBundleVariant: diff --git a/Media/video-recording.gif b/Media/video-recording.gif new file mode 100644 index 0000000..dd06794 Binary files /dev/null and b/Media/video-recording.gif differ diff --git a/Media/video-recording.gif.meta b/Media/video-recording.gif.meta new file mode 100644 index 0000000..6248e72 --- /dev/null +++ b/Media/video-recording.gif.meta @@ -0,0 +1,166 @@ +fileFormatVersion: 2 +guid: 23d2184729b984b43845d5e792a06536 +TextureImporter: + internalIDToNameTable: [] + externalObjects: {} + serializedVersion: 12 + mipmaps: + mipMapMode: 0 + enableMipMap: 1 + sRGBTexture: 1 + linearTexture: 0 + fadeOut: 0 + borderMipMap: 0 + mipMapsPreserveCoverage: 0 + alphaTestReferenceValue: 0.5 + mipMapFadeDistanceStart: 1 + mipMapFadeDistanceEnd: 3 + bumpmap: + convertToNormalMap: 0 + externalNormalMap: 0 + heightScale: 0.25 + normalMapFilter: 0 + flipGreenChannel: 0 + isReadable: 0 + streamingMipmaps: 0 + streamingMipmapsPriority: 0 + vTOnly: 0 + ignoreMipmapLimit: 0 + grayScaleToAlpha: 0 + generateCubemap: 6 + cubemapConvolution: 0 + seamlessCubemap: 0 + textureFormat: 1 + maxTextureSize: 2048 + textureSettings: + serializedVersion: 2 + filterMode: 1 + aniso: 1 + mipBias: 0 + wrapU: 0 + wrapV: 0 + wrapW: 0 + nPOTScale: 1 + lightmap: 0 + compressionQuality: 50 + spriteMode: 0 + spriteExtrude: 1 + spriteMeshType: 1 + alignment: 0 + spritePivot: {x: 0.5, y: 0.5} + spritePixelsToUnits: 100 + spriteBorder: {x: 0, y: 0, z: 0, w: 0} + spriteGenerateFallbackPhysicsShape: 1 + alphaUsage: 1 + alphaIsTransparency: 0 + spriteTessellationDetail: -1 + textureType: 0 + textureShape: 1 + singleChannelComponent: 0 + flipbookRows: 1 + flipbookColumns: 1 + maxTextureSizeSet: 0 + compressionQualitySet: 0 + textureFormatSet: 0 + ignorePngGamma: 0 + applyGammaDecoding: 0 + swizzle: 50462976 + cookieLightType: 0 + platformSettings: + - serializedVersion: 3 + buildTarget: DefaultTexturePlatform + maxTextureSize: 2048 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 0 + - serializedVersion: 3 + buildTarget: WebGL + maxTextureSize: 2048 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 0 + - serializedVersion: 3 + buildTarget: Standalone + maxTextureSize: 2048 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 0 + - 
serializedVersion: 3 + buildTarget: iPhone + maxTextureSize: 2048 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 0 + - serializedVersion: 3 + buildTarget: Android + maxTextureSize: 2048 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 0 + - serializedVersion: 3 + buildTarget: Server + maxTextureSize: 2048 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 1 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + ignorePlatformSupport: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 0 + spriteSheet: + serializedVersion: 2 + sprites: [] + outline: [] + physicsShape: [] + bones: [] + spriteID: + internalID: 0 + vertices: [] + indices: + edges: [] + weights: [] + secondaryTextures: [] + nameFileIdTable: {} + mipmapLimitGroupName: + pSDRemoveMatte: 0 + userData: + assetBundleName: + assetBundleVariant: diff --git a/Plugins.meta b/Plugins.meta new file mode 100644 index 0000000..5e2ec76 --- /dev/null +++ b/Plugins.meta @@ -0,0 +1,8 @@ +fileFormatVersion: 2 +guid: 2b9902a129d4b45b58fdbd5cfaab9962 +folderAsset: yes +DefaultImporter: + externalObjects: {} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Plugins/Android.meta b/Plugins/Android.meta new file mode 100644 index 0000000..a104996 --- /dev/null +++ b/Plugins/Android.meta @@ -0,0 +1,8 @@ +fileFormatVersion: 2 +guid: 765f82d2d2f114bf0917473b37c38da4 +folderAsset: yes +DefaultImporter: + externalObjects: {} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Plugins/Android/VideoKit.aar b/Plugins/Android/VideoKit.aar new file mode 100644 index 0000000..1af6a4b Binary files /dev/null and b/Plugins/Android/VideoKit.aar differ diff --git a/Plugins/Android/VideoKit.aar.meta b/Plugins/Android/VideoKit.aar.meta new file mode 100644 index 0000000..16ef4ba --- /dev/null +++ b/Plugins/Android/VideoKit.aar.meta @@ -0,0 +1,32 @@ +fileFormatVersion: 2 +guid: 5e537df6cfd5a419da213a7279af0581 +PluginImporter: + externalObjects: {} + serializedVersion: 2 + iconMap: {} + executionOrder: {} + defineConstraints: [] + isPreloaded: 0 + isOverridable: 1 + isExplicitlyReferenced: 0 + validateReferences: 1 + platformData: + - first: + Android: Android + second: + enabled: 1 + settings: {} + - first: + Any: + second: + enabled: 0 + settings: {} + - first: + Editor: Editor + second: + enabled: 0 + settings: + DefaultValueInitialized: true + userData: + assetBundleName: + assetBundleVariant: diff --git a/Plugins/Android/videokit-androidx-core-1.8.0.aar b/Plugins/Android/videokit-androidx-core-1.8.0.aar new file mode 100644 index 0000000..c68ebd2 Binary files /dev/null and b/Plugins/Android/videokit-androidx-core-1.8.0.aar differ diff --git a/Plugins/Android/videokit-androidx-core-1.8.0.aar.meta b/Plugins/Android/videokit-androidx-core-1.8.0.aar.meta new file mode 100644 index 0000000..77d2ff9 --- /dev/null +++ b/Plugins/Android/videokit-androidx-core-1.8.0.aar.meta @@ -0,0 +1,32 @@ +fileFormatVersion: 2 +guid: fc013291d33c445489c03c6588666108 +PluginImporter: + externalObjects: {} + serializedVersion: 2 + iconMap: {} + executionOrder: {} + 
defineConstraints: [] + isPreloaded: 0 + isOverridable: 1 + isExplicitlyReferenced: 0 + validateReferences: 1 + platformData: + - first: + Android: Android + second: + enabled: 1 + settings: {} + - first: + Any: + second: + enabled: 0 + settings: {} + - first: + Editor: Editor + second: + enabled: 0 + settings: + DefaultValueInitialized: true + userData: + assetBundleName: + assetBundleVariant: diff --git a/Plugins/Managed.meta b/Plugins/Managed.meta new file mode 100644 index 0000000..cf808b1 --- /dev/null +++ b/Plugins/Managed.meta @@ -0,0 +1,8 @@ +fileFormatVersion: 2 +guid: ee0b56c0653fd480abff5bf283fdfd0e +folderAsset: yes +DefaultImporter: + externalObjects: {} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Plugins/Managed/LICENSE b/Plugins/Managed/LICENSE new file mode 100644 index 0000000..08a4275 --- /dev/null +++ b/Plugins/Managed/LICENSE @@ -0,0 +1,20 @@ +The MIT License (MIT) + +Copyright (c) 2016 Rico Suter + +Permission is hereby granted, free of charge, to any person obtaining a copy of +this software and associated documentation files (the "Software"), to deal in +the Software without restriction, including without limitation the rights to +use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of +the Software, and to permit persons to whom the Software is furnished to do so, +subject to the following conditions: + +The above copyright notice and this permission notice shall be included in all +copies or substantial portions of the Software. + +THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR +IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, FITNESS +FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE AUTHORS OR +COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER LIABILITY, WHETHER +IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, OUT OF OR IN +CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE SOFTWARE. 
\ No newline at end of file diff --git a/Plugins/Managed/LICENSE.meta b/Plugins/Managed/LICENSE.meta new file mode 100644 index 0000000..6967d21 --- /dev/null +++ b/Plugins/Managed/LICENSE.meta @@ -0,0 +1,7 @@ +fileFormatVersion: 2 +guid: df8e4d4b1b1cf4553b14e3f20fe357ad +DefaultImporter: + externalObjects: {} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Plugins/Managed/NJsonSchema.dll b/Plugins/Managed/NJsonSchema.dll new file mode 100644 index 0000000..c4da7cf Binary files /dev/null and b/Plugins/Managed/NJsonSchema.dll differ diff --git a/Plugins/Managed/NJsonSchema.dll.meta b/Plugins/Managed/NJsonSchema.dll.meta new file mode 100644 index 0000000..0953430 --- /dev/null +++ b/Plugins/Managed/NJsonSchema.dll.meta @@ -0,0 +1,33 @@ +fileFormatVersion: 2 +guid: 8093a390c342147938f1985ae86d04ea +PluginImporter: + externalObjects: {} + serializedVersion: 2 + iconMap: {} + executionOrder: {} + defineConstraints: [] + isPreloaded: 0 + isOverridable: 1 + isExplicitlyReferenced: 0 + validateReferences: 1 + platformData: + - first: + Any: + second: + enabled: 1 + settings: {} + - first: + Editor: Editor + second: + enabled: 0 + settings: + DefaultValueInitialized: true + - first: + Windows Store Apps: WindowsStoreApps + second: + enabled: 0 + settings: + CPU: AnyCPU + userData: + assetBundleName: + assetBundleVariant: diff --git a/Plugins/Managed/NJsonSchema.xml b/Plugins/Managed/NJsonSchema.xml new file mode 100644 index 0000000..c26a30b --- /dev/null +++ b/Plugins/Managed/NJsonSchema.xml @@ -0,0 +1,2798 @@ + + + + NJsonSchema + + + + Indicates that the value of the marked element is nullable. + + + Interface to add an extension data property to a class or property, implementation needs to inherit from System.Attribute. + + + Gets the property name. + + + Gets the value. + + + Annotation to specify that array items or dictionary values are nullable. + + + Annotation to merge all inherited properties into this class/schema. + + + Initializes a new instance of the class. + + + Initializes a new instance of the class. + The explicit flag to override the global setting (i.e. disable the generation for a type). + + + Gets or sets a value indicating whether to set the x-abstract property for given type. + + + Annotation to specify the JSON Schema type for the given class. + + + Initializes a new instance of the class. + + + Initializes a new instance of the class. + The identifier of the schema which is used as key in the 'definitions' list. + + + Initializes a new instance of the class. + The JSON Schema type. + + + Gets or sets the name identifier of the schema which is used as key in the 'definitions' list. + + + Gets the JSON Schema type (default: , i.e. derived from ). + + + Gets or sets the JSON format type (default: null, i.e. derived from ). + + + Gets or sets the array item type. + + + Annotation to mark a property or class as string type with format 'date'. + + + Initializes a new instance of the class. + + + Adds an extension data property to a class or property. + + + + Initializes a new instance of the class. + The key. + The value. + + + Gets the property name. + + + Gets the value. + + + Annotation to merge all inherited properties into this class/schema. + + + Initializes a new instance of the class. + + + Initializes a new instance of the class. + The explicit flag to override the global setting (i.e. disable the generation for a type). + + + Gets or sets a value indicating whether to flatten the given type. 
+ + + Indicates that the marked class is ignored during the JSON Schema generation. + + + Annotation to specify the JSON Schema pattern properties. + + + Initializes a new instance of the class. + The pattern property regular expression. + + + Initializes a new instance of the class. + The pattern property regular expression. + The pattern properties type. + + + Gets the pattern properties regular expression. + + + Gets the pattern properties type. + + + Registers an schema processor for the given class. + + + + Initializes a new instance of the class. + The schema processor type (must implement ISchemaProcessor). + The parameters. + + + Gets or sets the type of the operation processor (must implement ISchemaProcessor). + + + Gets or sets the type of the constructor parameters. + + + Specifies the type to use for JSON Schema generation. + + + Initializes a new instance of the class. + The type of the schema. + + + Gets or sets the response type. + + + Gets or sets a value indicating whether the schema can be null (default: null = unchanged). + + + Gets the raw nullable information. + + + Attribute to set the multipleOf parameter of a JSON Schema. + + + Initializes a new instance of the class. + The multipleOf value. + + + Initializes a new instance of the class. + The multipleOf value. + + + Gets the value whose modulo the the JSON value must be zero. + + + Indicates that the value of the marked element could never be null. + + + + Performance helpers avoiding struct enumerator building and generally faster accessing. + + + + An implementation of an observable dictionary. + The type of the key. + The type of the value. + + + Initializes a new instance of the class. + + + Initializes a new instance of the class. + The dictionary to initialize this dictionary. + + + Initializes a new instance of the class. + The comparer. + + + Initializes a new instance of the class. + The capacity. + + + Initializes a new instance of the class. + The dictionary to initialize this dictionary. + The comparer. + + + Initializes a new instance of the class. + The capacity. + The comparer. + + + Adds multiple key-value pairs the the dictionary. + The key-value pairs. + + + Inserts a key-value pair into the dictionary. + The key. + The value. + If true and key already exists then an exception is thrown. + + + Provides name conversion utility methods. + + + Converts the first letter to lower case and dashes to camel case. + The input. + Specifies whether to add an _ when the first character is not alpha. + The converted input. + + + Converts the first letter to upper case and dashes to camel case. + The input. + Specifies whether to add an _ when the first character is not alpha. + The converted input. + + + Converts the string to a string literal which can be used in C# or TypeScript code. + The input. + The literal. + + + Converts the input to a camel case identifier. + The input. + The converted input. + + + Trims white spaces from the text. + The text. + The updated text. + + + Removes the line breaks from the text. + The text. + The updated text. + + + Singularizes the given noun in plural. + The plural noun. + The singular noun. + + + Add tabs to the given string. + The input. + The tab count. + The output. + + + Add tabs to the given string. + The input. + The tab count. + Stream to write transformed content into. + The output. + + + Converts all line breaks in a string into '\n' and removes white spaces. + The input. + The tab count. + The output. 
+ + + A converter to correctly serialize exception objects. + + + Initializes a new instance of the class. + + + Initializes a new instance of the class. + If set to true the serializer hides stack trace (i.e. sets the StackTrace to 'HIDDEN'). + The namespaces to search for exception types. + + + Gets a value indicating whether this can write JSON. + + + Writes the JSON representation of the object. + The to write to. + The value. + The calling serializer. + + + Determines whether this instance can convert the specified object type. + Type of the object. + true if this instance can convert the specified object type; otherwise, false. + + + Reads the JSON representation of the object. + The to read from. + Type of the object. + The existing value of object being read. + The calling serializer. + The object value. + + + Defines a child class in the inheritance chain. + + + Initializes a new instance of the class. + The discriminator key. + The child class type. + + + Gets the discriminator key. + + + Gets the child class type. + + + Defines the class as inheritance base class and adds a discriminator property to the serialized object. + + + Gets the default discriminiator name. + + + Initializes a new instance of the class. + + + Initializes a new instance of the class. + The discriminator. + + + Initializes a new instance of the class. + The discriminator property name. + Read the $type property to determine the type + (fallback, should not be used as it might lead to security problems). + + + Initializes a new instance of the class which only applies for the given base type. + Use this constructor for global registered converters (not defined on class). + The base type. + + + Initializes a new instance of the class which only applies for the given base type. + Use this constructor for global registered converters (not defined on class). + The base type. + The discriminator. + + + Initializes a new instance of the class which only applies for the given base type. + Use this constructor for global registered converters (not defined on class). + The base type. + The discriminator. + Read the $type property to determine the type + (fallback, should not be used as it might lead to security problems). + + + Gets the discriminator property name. + + + Writes the JSON representation of the object. + The to write to. + The value. + The calling serializer. + + + Gets a value indicating whether this can write JSON. + + + Gets a value indicating whether this can read JSON. + + + Determines whether this instance can convert the specified object type. + Type of the object. + true if this instance can convert the specified object type; otherwise, false. + + + Reads the JSON representation of the object. + The to read from. + Type of the object. + The existing value of object being read. + The calling serializer. + The object value. + + + Gets the discriminator value for the given type. + The object type. + The discriminator value. + + + Gets the type for the given discriminator value. + The JSON object. + The object (base) type. + The discriminator value. + + + + Regenerates reference paths and correctly generates $ref properties. + + + Gets a value indicating whether this can write JSON. + + + Determines whether this instance can convert the specified object type. + Type of the object. + true if this instance can convert the specified object type; otherwise, false. + + + Reads the JSON representation of the object. + The to read from. + Type of the object. + The existing value of object being read. 
+ The calling serializer. + The object value. + + + Writes the JSON representation of the object. + The to write to. + The value. + The calling serializer. + + + Converts the last part of the full type name to upper case. + + + Gets or sets the reserved names. + + + Gets the name mappings. + + + Generates the type name for the given schema respecting the reserved type names. + The schema. + The type name hint. + The reserved type names. + The type name. + + + Generates the type name for the given schema. + The schema. + The type name hint. + The type name. + + + + Replaces all characters that are not normals letters, numbers or underscore, with an underscore. + Will prepend an underscore if the first characters is a number. + In case there are this would result in multiple underscores in a row, strips down to one underscore. + Will trim any underscores at the end of the type name. + + + + The default reflection service implementation. + + + Creates a from a . + The type. + The settings. + The . + + + Creates a from a . + The type. + The default reference type null handling used when no nullability information is available. + The settings. + The . + + + Checks whether a type is nullable. + The type. + The default reference type null handling used when no nullability information is available. + true if the type can be null. + + + Checks whether the give type is a string enum. + The type. + The serializer settings. + The result. + + + Checks whether the given type is a file/binary type. + The type. + true or false. + + + Checks whether the given type is an IAsyncEnumerable type. + + See this issue: https://github.com/RicoSuter/NSwag/issues/2582#issuecomment-576165669 + + The type. + true or false. + + + Checks whether the given type is an array type. + The type. + true or false. + + + Checks whether the given type is a dictionary type. + The type. + true or false. + + + The default schema name generator implementation. + + + Generates the name of the JSON Schema. + The type. + The new name. + + + Defines the enum handling. + + + Generates an integer field without enumeration (except when using StringEnumConverter). + + + Generates a string field with JSON Schema enumeration. + + + Generates a camel-cased string field with JSON Schema enumeration. + + + Provides methods to reflect on types. + + + Creates a from a . + The type. + The default reference type null handling. + The settings. + The . + + + Creates a from a . + The type. + The settings. + The . + + + Checks whether a type is nullable. + The type. + The default reference type null handling used when no nullability information is available. + true if the type can be null. + + + Checks whether the give type is a string enum. + The type. + The serializer settings. + The result. + + + The schema name generator. + + + Generates the name of the JSON Schema for the given type. + The type. + The new name. + + + The schema processor interface. + + + Processes the specified JSON Schema. + The schema context. + + + The XML Docs related settings. + + + Gets or sets a value indicating whether to read XML Docs (default: true). + + + Gets or sets a value indicating whether tho resolve the XML Docs from the NuGet cache or .NET SDK directory (default: true). + + + Gets or sets the XML Docs formatting (default: None). + + + Generates a object for a given type. + + + Initializes a new instance of the class. + The settings. + + + Gets the settings. + + + Generates a object for the given type and adds the mapping to the given resolver. + The type. 
+ The schema. + Could not find value type of dictionary type. + + + Generates a object for the given type and adds the mapping to the given resolver. + The type. + The schema resolver. + The schema. + Could not find value type of dictionary type. + + + Generates a object for the given type and adds the mapping to the given resolver. + The type. + The schema resolver. + The schema. + Could not find value type of dictionary type. + + + Generates a object for the given type and adds the mapping to the given resolver. + The type. + The schema resolver. + The schema. + Could not find value type of dictionary type. + + + Generates a object for the given type and adds the mapping to the given resolver. + The type. + The schema resolver. + The schema. + Could not find value type of dictionary type. + + + Generates into the given object for the given type and adds the mapping to the given resolver. + The type of the schema. + The schema. + The type. + The schema resolver. + The schema. + Could not find value type of dictionary type. + + + Generates into the given object for the given type and adds the mapping to the given resolver. + The type of the schema. + The schema. + The type. + The schema resolver. + The schema. + Could not find value type of dictionary type. + + + Generetes a schema directly or referenced for the requested schema type; + does NOT change nullability. + The resulted schema type which may reference the actual schema. + The type of the schema to generate. + The schema resolver. + An action to transform the resulting schema (e.g. property or parameter) before the type of reference is determined (with $ref or allOf/oneOf). + The requested schema object. + + + Generetes a schema directly or referenced for the requested schema type; + also adds nullability if required by looking at the type's . + The resulted schema type which may reference the actual schema. + The type of the schema to generate. + The schema resolver. + An action to transform the resulting schema (e.g. property or parameter) before the type of reference is determined (with $ref or allOf/oneOf). + The requested schema object. + + + Generetes a schema directly or referenced for the requested schema type; also adds nullability if required. + The resulted schema type which may reference the actual schema. + The type of the schema to generate. + Specifies whether the property, parameter or requested schema type is nullable. + The schema resolver. + An action to transform the resulting schema (e.g. property or parameter) before the type of reference is determined (with $ref or allOf/oneOf). + The requested schema object. + + + Gets the converted property name. + The property. + The accessor info. + The property name. + + + Applies the property annotations to the JSON property. + The schema. + The property type description. + + + Gets the actual default value for the given object (e.g. correctly converts enums). + The value type. + The default value. + The converted default value. + + + Generates the example from the type's xml docs. + The type. + The JToken or null. + + + Generates the example from the accesor's xml docs. + The accessor info. + The JToken or null. + + + Generates the properties for the given type and schema. + The properties + The type description. + The schema resolver. + The task. + + + Gets the properties of the given type or null to take all properties. + The type. + The property names or null for all. + + + Generates an array in the given schema. + The schema type. + The schema. 
+ The type description. + The schema resolver. + + + Generates an array in the given schema. + The schema type. + The schema. + The type description. + The schema resolver. + Could not find value type of dictionary type. + + + Generates an enumeration in the given schema. + The schema. + The type description. + + + Checks whether a property is ignored. + The accessor info. + The properties parent type. + The result. + + + The JSON Schema generator settings. + + + Initializes a new instance of the class. + + + Gets or sets the default reference type null handling when no nullability information is available (default: Null). + + + Gets or sets the default reference type null handling of dictionary value types when no nullability information is available (default: NotNull). + + + Gets or sets a value indicating whether to generate abstract properties (i.e. interface and abstract properties. Properties may defined multiple times in a inheritance hierarchy, default: false). + + + Gets or sets a value indicating whether to flatten the inheritance hierarchy instead of using allOf to describe inheritance (default: false). + + + Gets or sets a value indicating whether to generate the x-abstract flag on schemas (default: true). + + + Gets or sets a value indicating whether to generate schemas for types in attributes (default: true). + + + Gets or sets a value indicating whether to generate xmlObject representation for definitions (default: false). + + + Gets or sets a value indicating whether to ignore properties with the . + + + Gets or sets a value indicating whether to use $ref references even if additional properties are + defined on the object (otherwise allOf/oneOf with $ref is used, default: false). + + + Gets or sets a value indicating whether to generate a description with number to enum name mappings (for integer enums only, default: false). + + + Will set `additionalProperties` on all added schema definitions and references(default: false). + + + Gets or sets a value indicating whether to generate the example property of the schemas based on the <example> xml docs entry as JSON (requires to be true, default: true). + + + Gets or sets the schema type to generate (default: JsonSchema). + + + Gets or sets the Newtonsoft JSON serializer settings. + , and will be ignored. + + + Gets or sets the System.Text.Json serializer options. + , and will be ignored. + + + Gets or sets the excluded type names (same as ). + + + Gets or sets a value indicating whether to read XML Docs (default: true). + + + Gets or sets a value indicating whether tho resolve the XML Docs from the NuGet cache or .NET SDK directory (default: true). + + + Gets or sets the XML Docs formatting (default: None). + + + Gets or sets the type name generator. + + + Gets or sets the schema name generator. + + + Gets or sets the reflection service. + + + Gets or sets the type mappings. + + + Gets or sets the schema processors. + + + Gets or sets a value indicating whether to generate x-nullable properties (Swagger 2 only). + + + Gets or sets the contract resolver. + will be ignored. + + + Gets or sets the default property name handling (default: Default). + + + Gets or sets the default enum handling (default: Integer). + + + Gets the contract resolver. + The contract resolver. + A setting is misconfigured. + + + Gets the serializer settings. + A setting is misconfigured. + + + Gets the contract for the given type. + The type. + The contract. 
+ + + Gets the actual computed setting based on the global setting and the JsonSchemaAbstractAttribute attribute. + The type. + The result. + + + Gets the actual computed setting based on the global setting and the JsonSchemaFlattenAttribute attribute. + The type. + The result. + + + Manager which resolves types to schemas and appends missing schemas to the root object. + + + Initializes a new instance of the class. + The root schema. + The settings. + + + Determines whether the specified type has a schema. + The type. + Specifies whether the type is an integer enum. + true when the mapping exists. + + + Gets the schema for a given type. + The type. + Specifies whether the type is an integer enum. + The schema. + + + Adds a schema to type mapping. + The type. + Specifies whether the type is an integer enum. + The schema. + Added schema is not a JsonSchema4 instance. + + + Gets all the schemas. + + + + Gets the mapping key for the given type. + + The type. + Specifies whether the type is an integer enum. + The mapping key. + + + Gets JSON information about a .NET type. + + + Creates a description for a primitive type or object. + The type. + The JSON type. + Specifies whether the type is nullable. + The format string (may be null). + The description. + + + Creates a description for a dictionary. + The type. + The JSON type. + Specifies whether the type is nullable. + The description. + + + Creates a description for an enumeration. + The type. + The JSON type. + Specifies whether the type is nullable. + The description. + + + Gets the actual contextual type. + + + Gets the type. + + + Gets a value indicating whether the object is a generic dictionary. + + + Gets a value indicating whether the type is an enum. + + + Gets the format string. + + + Gets or sets a value indicating whether the type is nullable. + + + Gets a value indicating whether this is a complex type (i.e. object, dictionary or array). + + + Gets a value indicating whether this is an any type (e.g. object). + + + Specifices whether the type requires a reference. + The type mappers. + true or false. + + + Applies the type and format to the given schema. + The JSON schema. + + + Defines the property name handling. + + + Generates property name using reflection (respecting the and ). + + + Generates lower camel cased property names using . + + + Generates snake cased property names using . + + + Specifies the default null handling for reference types when no nullability information is available. + + + Reference types are nullable by default (C# default). + + + Reference types cannot be null by default. + + + Generates a sample JSON object from a JSON Schema. + + + + Initializes a new instance of class with default settings.. + + + + + Initializes a new instance of class. + + The settings to use. + + + Generates a sample JSON object from a JSON Schema. + The JSON Schema. + The JSON token. + + + Settings for generating sample json data. + + + Gets or sets a value indicating whether to generate optional properties (default: true). + + + Gets or sets a value indicating the max level of recursion the generator is allowed to perform (default: 3) + + + Generates a JSON Schema from sample JSON data. + + + Generates the JSON Schema for the given JSON data. + The JSON data. + The JSON Schema. + + + Generates the JSON Schema for the given JSON data. + The JSON data stream. + The JSON Schema. + + + The schema processor context. + + + Initializes a new instance of the class. + The source contextual type. + The JSON Schema. + The resolver. 
+ The generator. + The settings. + + + The source type. + + + The source contextual type. + + + The JSON Schema to process. + + + The JSON Schema resolver. + + + Gets the JSON Schema generator. + + + Gets the settings. + + + + Utility methods for dealing with System.Text.Json. + + + + + Converst System.Text.Json serializer options to Newtonsoft JSON settings. + + The options. + The settings. + + + Maps .NET type to a generated JSON Schema. + + + Gets the mapped type. + + + Gets a value indicating whether to use a JSON Schema reference for the type. + + + Gets the schema for the mapped type. + The schema. + The context. + + + Maps .NET type to a generated JSON Schema describing an object. + + + Initializes a new instance of the class. + Type of the mapped. + The schema. + + + Initializes a new instance of the class. + Type of the mapped. + The schema factory. + + + Gets the mapped type. + + + Gets a value indicating whether to use a JSON Schema reference for the type. + + + Gets the schema for the mapped type. + The schema. + The context. + + + Maps .NET type to a generated JSON Schema describing a primitive value. + + + Initializes a new instance of the class. + Type of the mapped. + The transformer. + + + Gets the mapped type. + + + Gets a value indicating whether to use a JSON Schema reference for the type. + + + Gets the schema for the mapped type. + The schema. + The context. + + + The context object for the interface. + + + Initializes a new instance of the class. + + + The source type. + + + The . + + + The . + + + The parent properties (e.g. the attributes on the property). + + + Provides a property to get a documents path or base URI. + + + Gets the document path (URI or file path). + + + The base JSON interface with extension data. + + + Gets or sets the extension data (i.e. additional properties which are not directly defined by the JSON object). + + + Provides dynamic access to framework APIs. + + + Gets a value indicating whether file APIs are available. + + + Gets a value indicating whether path APIs are available. + + + Gets a value indicating whether path APIs are available. + + + Gets a value indicating whether XPath APIs are available. + + + Gets a value indicating whether WebClient APIs are available. + + + Request the given URL via HTTP. + The URL. + The cancellation token. + The content. + The HttpClient.GetAsync API is not available on this platform. + + + Gets the current working directory. + The directory path. + The System.IO.Directory API is not available on this platform. + + + Gets the current working directory. + The directory path. + The System.IO.Directory API is not available on this platform. + + + Gets the files of the given directory. + The directory. + The filter. + The file paths. + The System.IO.Directory API is not available on this platform. + + + Retrieves the parent directory of the specified path, including both absolute and relative paths.. + The directory path. + The System.IO.Directory API is not available on this platform. + + + Creates a directory. + The directory. + The System.IO.Directory API is not available on this platform. + + + Checks whether a directory exists. + The file path. + true or false + The System.IO.Directory API is not available on this platform. + + + Checks whether a file exists. + The file path. + true or false + The System.IO.File API is not available on this platform. + + + Reads all content of a file (UTF8 with or without BOM). + The file path. + The file content. 
+ The System.IO.File API is not available on this platform. + + + Writes text to a file (UTF8 without BOM). + The file path. + The text. + + The System.IO.File API is not available on this platform. + + + Combines two paths. + The path1. + The path2. + The combined path. + The System.IO.Path API is not available on this platform. + + + Gets the file name from a given path. + The file path. + The directory name. + The System.IO.Path API is not available on this platform. + + + Gets the full path from a given path + The path + The full path + The System.IO.Path API is not available on this platform. + + + Gets the directory path of a file path. + The file path. + The directory name. + The System.IO.Path API is not available on this platform. + + + Evaluates the XPath for a given XML document. + The document. + The path. + The value. + The System.Xml.XPath.Extensions API is not available on this platform. + + + + Handle cases of specs in subdirectories having external references to specs also in subdirectories + + + + + + + The JSON Schema serialization context holding information about the current serialization. + + + Gets or sets the current schema type. + + + Gets the current serializer settings. + + + Gets or sets a value indicating whether the object is currently converted to JSON. + + + Serializes an object to a JSON string with reference handling. + The object to serialize. + The schema type. + The contract resolver. + The formatting. + The JSON. + + + Deserializes JSON data to a schema with reference handling. + The JSON data. + The schema type. + The document path. + The reference resolver factory. + The contract resolver. + The deserialized schema. + + + Deserializes JSON data to a schema with reference handling. + The JSON data. + The schema type. + The document path. + The reference resolver factory. + The contract resolver. + The cancellation token + The deserialized schema. + + + Deserializes JSON data to a schema with reference handling. + The JSON data stream. + The schema type. + The document path. + The reference resolver factory. + The contract resolver. + The cancellation token + The deserialized schema. + + + Deserializes JSON data with the given contract resolver. + The JSON data. + The contract resolver. + The deserialized schema. + + + Deserializes JSON data with the given contract resolver. + The JSON data stream. + The contract resolver. + The deserialized schema. + + + JsonConvert resolver that allows to ignore and rename properties for given types. + + + Initializes a new instance of the class. + + + Ignore the given property/properties of the given type. + The type. + One or more JSON properties to ignore. + + + Rename a property of the given type. + The type. + The JSON property name to rename. + The new JSON property name. + + + Creates a JsonProperty for the given System.Reflection.MemberInfo. + The member's parent Newtonsoft.Json.MemberSerialization. + The member to create a JsonProperty for. + A created JsonProperty for the given System.Reflection.MemberInfo. + + + Provides extension methods for reading contextual type names and descriptions. + + + Gets the name of the property for JSON serialization. + The name. + + + Gets the description of the given member (based on the DescriptionAttribute, DisplayAttribute or XML Documentation). + The member info + The XML Docs settings. + The description or null if no description is available. + + + Gets the description of the given member (based on the DescriptionAttribute, DisplayAttribute or XML Documentation). 
+ The accessor info. + The XML Docs settings. + The description or null if no description is available. + + + Gets the description of the given member (based on the DescriptionAttribute, DisplayAttribute or XML Documentation). + The parameter. + The XML Docs settings. + The description or null if no description is available. + + + Extension methods to help out generating XMLObject structure to schema. + + + Generate XML object for a JSON Schema definition. + The JSON Schema. + The type of the JSON Schema. + + + Generates an XML object for a JSON Schema definition. + The JSON Schema + + + Generates XMLObject structure for an array with primitive types + The JSON Schema of the item. + The item type. + + + Generates XMLObject structure for a property. + The JSON Schema for the property + The type. + The property name. + + + type.Name is used int will return Int32, string will return String etc. + These are not valid with how the XMLSerializer performs. + + + Generates the type name for a given . + + + Generates the type name. + The property. + The type name hint (the property name or definition key). + The reserved type names. + The new name. + + + The base JSON class with extension data. + + + Gets or sets the extension data (i.e. additional properties which are not directly defined by the JSON object). + + + Deserializes all JSON Schemas in the extension data property. + + + Transforms the extension data so that contained schemas are correctly deserialized. + The extension object. + The serializer. + + + Class containing the constants available as format string. + + + Format for a . + + + Non-standard Format for a duration (time span). + + + Format for a duration (time span) as of 2019-09 . + + + Format for an email. + + + Format for an URI. + + + Format for an GUID. + + + Format for an UUID (same as GUID). + + + Format for an integer. + + + Format for a long integer. + + + Format for a double number. + + + Format for a float number. + + + Format for a decimal number. + + + Format for an IP v4 address. + + + Format for an IP v6 address. + + + Format for binary data encoded with Base64. + Should not be used. Prefer using Byte property of + + + Format for a byte if used with numeric type or for base64 encoded value otherwise. + + + Format for a binary value. + + + Format for a hostname (DNS name). + + + Format for a phone number. + + + Format for a full date per RFC3339 Section 5.6. + + + Format for a full time per RFC3339 Section 5.6. + + + Enumeration of the possible object types. + + + No object type. + + + An array. + + + A boolean value. + + + An integer value. + + + A null. + + + An number value. + + + An object. + + + A string. + + + A file (used in Swagger specifications). + + + Utilities to work with JSON paths. + + + Gets the JSON path of the given object. + The root object. + The object to search. + The path or null when the object could not be found. + Could not find the JSON path of a child object. + is + + + Gets the JSON path of the given object. + The root object. + The object to search. + The contract resolver. + The path or null when the object could not be found. + Could not find the JSON path of a child object. + is + + + Gets the JSON path of the given object. + The root object. + The objects to search. + The contract resolver. + The path or null when the object could not be found. + Could not find the JSON path of a child object. + is + + + Resolves JSON Pointer references. + + + Initializes a new instance of the class. + The schema appender. 
+ + + Creates the factory to be used in the FromJsonAsync method. + The type name generator. + The factory. + + + Adds a document reference. + The document path. + The referenced schema. + + + Gets the object from the given JSON path. + The root object. + The JSON path. + The target type to resolve. + The contract resolver. + The cancellation token + The JSON Schema or null when the object could not be found. + Could not resolve the JSON path. + Could not resolve the JSON path. + + + Gets the object from the given JSON path. + The root object. + The JSON path. + The target type to resolve. + The contract resolver. + The cancellation token + The JSON Schema or null when the object could not be found. + Could not resolve the JSON path. + Could not resolve the JSON path. + + + Resolves a document reference. + The root object. + The JSON path to resolve. + The target type to resolve. + The contract resolver. + The resolved JSON Schema. + Could not resolve the JSON path. + + + Resolves a file reference. + The file path. + The cancellation token + The resolved JSON Schema. + The System.IO.File API is not available on this platform. + + + Resolves an URL reference. + The URL. + The cancellation token + The HttpClient.GetAsync API is not available on this platform. + + + Resolves file path. + The document path. + The JSON path + + + A base class for describing a JSON schema. + + + Initializes a new instance of the class. + + + Creates a schema which matches any data. + The any schema. + + + Creates a schema which matches any data. + The any schema. + + + Gets the NJsonSchema toolchain version. + + + Creates a from a given type. + The type to create the schema for. + The . + + + Creates a from a given type. + The type to create the schema for. + The . + + + Creates a from a given type. + The type to create the schema for. + The settings. + The . + + + Creates a from a given type. + The type to create the schema for. + The settings. + The . + + + Creates a from sample JSON data. + The JSON Schema. + + + Creates a from sample JSON data. + The JSON Schema. + + + Loads a JSON Schema from a given file path (only available in .NET 4.x). + The file path. + Cancellation token instance + The JSON Schema. + + + Loads a JSON Schema from a given file path (only available in .NET 4.x). + The file path. + The JSON reference resolver factory. + The cancellation token + The JSON Schema. + The System.IO.File API is not available on this platform. + + + Loads a JSON Schema from a given URL (only available in .NET 4.x). + The URL to the document. + The cancellation token + The JSON Schema. + The HttpClient.GetAsync API is not available on this platform. + + + Loads a JSON Schema from a given URL (only available in .NET 4.x). + The URL to the document. + The JSON reference resolver factory. + The cancellation token + The JSON Schema. + The HttpClient.GetAsync API is not available on this platform. + + + Deserializes a JSON string to a . + The JSON string. + The cancellation token + The JSON Schema. + + + Deserializes a JSON stream to a . + The JSON data stream. + The cancellation token + The JSON Schema. + + + Deserializes a JSON string to a . + The JSON string. + The document path (URL or file path) for resolving relative document references. + The cancellation token + The JSON Schema. + + + Deserializes a JSON string to a . + The JSON string. + The document path (URL or file path) for resolving relative document references. + The JSON reference resolver factory. + The cancellation token + The JSON Schema. 
+ + + Deserializes a JSON string to a . + The JSON data stream. + The document path (URL or file path) for resolving relative document references. + The JSON reference resolver factory. + The cancellation token + The JSON Schema. + + + Gets a value indicating whether the schema is binary (file or binary format). + + + Gets the inherited/parent schema (most probable base schema in allOf). + Used for code generation. + + + Gets the inherited/parent schema which may also be inlined + (the schema itself if it is a dictionary or array, otherwise ). + Used for code generation. + + + Gets the list of all inherited/parent schemas. + Used for code generation. + + + Determines whether the given schema is the parent schema of this schema (i.e. super/base class). + The possible subtype schema. + true or false + + + Gets the discriminator or discriminator of an inherited schema (or null). + + + + Calculates whether has elements without incurring collection building + performance cost. + + + + Gets all properties of this schema (i.e. all direct properties and properties from the schemas in allOf which do not have a type). + Used for code generation. + Some properties are defined multiple times. + + + Gets or sets the schema. + + + Gets or sets the id. + + + Gets or sets the title. + + + Gets a value indicating whether the schema title can be used as type name. + + + Gets or sets the description. + + + Gets the object types (as enum flags). + + + Gets the parent schema of this schema. + + + Gets the parent schema of this schema. + + + Gets or sets the format string. + + + Gets or sets the default value. + + + Gets or sets the required multiple of for the number value. + + + Gets or sets the maximum allowed value. + + + Gets or sets the exclusive maximum value (v6). + + + Gets or sets a value indicating whether the minimum value is excluded (v4). + + + Gets or sets the minimum allowed value. + + + Gets or sets the exclusive minimum value (v6). + + + Gets or sets a value indicating whether the minimum value is excluded (v4). + + + Gets or sets the maximum length of the value string. + + + Gets or sets the minimum length of the value string. + + + Gets or sets the validation pattern as regular expression. + + + Gets or sets the maximum length of the array. + + + Gets or sets the minimum length of the array. + + + Gets or sets a value indicating whether the items in the array must be unique. + + + Gets or sets the maximal number of allowed properties in an object. + + + Gets or sets the minimal number of allowed properties in an object. + + + Gets or sets a value indicating whether the schema is deprecated (native in Open API 'deprecated', custom in Swagger/JSON Schema 'x-deprecated'). + + + Gets or sets a message indicating why the schema is deprecated (custom extension, sets 'x-deprecatedMessage'). + + + Gets or sets a value indicating whether the type is abstract, i.e. cannot be instantiated directly (x-abstract). + + + Gets or sets a value indicating whether the schema is nullable (native in Open API 'nullable', custom in Swagger 'x-nullable'). + + + Gets or sets the example (Swagger and Open API only). + + + Gets or sets a value indicating this is an bit flag enum (custom extension, sets 'x-enumFlags', default: false). + + + Gets the collection of required properties. + + + Gets a value indicating whether this is enumeration. + + + Gets the collection of required properties. + This collection can also be changed through the property. 
> + + + Gets or sets the dictionary key schema (x-dictionaryKey, only enum schemas are allowed). + + + Gets the properties of the type. + + + Gets the xml object of the schema (used in Swagger specifications). + + + Gets the pattern properties of the type. + + + Gets or sets the schema of an array item. + + + Gets or sets the schemas of the array's tuple values. + + + Gets or sets the schema which must not be valid. + + + Gets the other schema definitions of this schema. + + + Gets the collection of schemas where each schema must be valid. + + + Gets the collection of schemas where at least one must be valid. + + + Gets the collection of schemas where exactly one must be valid. + + + Gets or sets a value indicating whether additional items are allowed (default: true). + If this property is set to false, then is set to null. + + + Gets or sets the schema for the additional items. + If this property has a schema, then is set to true. + + + Gets or sets a value indicating whether additional properties are allowed (default: true). + If this property is set to false, then is set to null. + + + Gets or sets the schema for the additional properties. + If this property has a schema, then is set to true. + + + Gets a value indicating whether the schema describes an object. + + + Gets a value indicating whether the schema represents an array type (an array where each item has the same type). + + + Gets a value indicating whether the schema represents an tuple type (an array where each item may have a different type). + + + Gets a value indicating whether the schema represents a dictionary type (no properties and AdditionalProperties or PatternProperties contain a schema). + + + Gets a value indicating whether this is any type (e.g. any in TypeScript or object in CSharp). + + + Gets a value indicating whether the validated data can be null. + The schema type. + true if the type can be null. + + + Serializes the to a JSON string. + The JSON string. + + + Serializes the to a JSON string. + The formatting. + The JSON string. + + + Creates a from sample JSON data. + The JSON Schema. + + + Gets a value indicating whether this schema inherits from the given parent schema. + The parent schema. + true or false. + + + Validates the given JSON data against this schema. + The JSON data to validate. + The validator settings. + Could not deserialize the JSON data. + The collection of validation errors. + + + Validates the given JSON token against this schema. + The token to validate. + The validator settings. + The collection of validation errors. + + + Validates the given JSON data against this schema. + The JSON data to validate. + The type of the schema. + The validator settings. + Could not deserialize the JSON data. + The collection of validation errors. + + + Validates the given JSON token against this schema. + The token to validate. + The type of the schema. + The validator settings. + The collection of validation errors. + + + Gets the actual schema, either this or the referenced schema. + Cyclic references detected. + The schema reference path has not been resolved. + + + Gets the type actual schema (e.g. the shared schema of a property, parameter, etc.). + Cyclic references detected. + The schema reference path has not been resolved. + + + Gets a value indicating whether this is a schema reference ($ref, , or ). + + + Gets a value indicating whether this is an allOf schema reference. + + + Gets a value indicating whether this is an oneOf schema reference. 
+ + + Gets a value indicating whether this is an anyOf schema reference. + + + Cyclic references detected. + The schema reference path has not been resolved. + + + Gets the actual referenced object, either this or the reference object. + + + Gets the parent object of this object. + + + Gets or sets the referenced object. + + + Creates the serializer contract resolver based on the . + The schema type. + The settings. + + + Gets or sets the extension data (i.e. additional properties which are not directly defined by JSON Schema). + + + Gets the discriminator property (Swagger only). + + + Gets or sets the discriminator property (Swagger only, should not be used in internal tooling). + + + Gets the actual resolved discriminator of this schema (no inheritance, OpenApi only). + + + Gets or sets the discriminator of this schema (OpenApi only). + + + Gets or sets the discriminator. + + + Gets or sets the enumeration names (optional, draft v5). + + + Gets or sets a value indicating whether the maximum value is excluded. + + + Gets or sets a value indicating whether the minimum value is excluded. + + + Gets or sets the enumeration names (optional, draft v5). + + + Appends a schema to a document (i.e. another schema). + + + Initializes a new instance of the class. + The root schema. + The type name generator. + + + Gets the root object. + + + Gets the root schema. + + + Appends the schema to the root object. + The schema to append. + The type name hint. + is + The root schema cannot be appended. + + + A description of a JSON property of a JSON schema. + + + Gets or sets the name of the property. + + + Gets the parent schema of this property schema. + + + Gets or sets a value indicating whether the property is required. + + + Value used to set property even if parent is not set yet. + + + Gets or sets a value indicating whether the property is read only (Swagger and Open API only). + + + Gets or sets a value indicating whether the property is write only (Open API only). + + + Gets a value indicating whether the property is an inheritance discriminator. + + + Determines whether the specified property null handling is nullable. + The schema type. + true if the type can be null. + + + Provides utilities to resolve and set JSON schema references. + + + Updates all properties from the + available properties. + The JSON document resolver. + The root object. + The cancellation token + + + Updates all properties from the + available properties. + The JSON document resolver. + The root object. + The contract resolver. + The cancellation token + + + Updates the properties + from the available properties with inlining external references. + The root object. + + + Updates the properties + from the available properties. + The root object. + Specifies whether to remove external references (otherwise they are inlined). + The contract resolver. + + + A description of a JSON property of a JSON object (used in Swagger specifications). + + + Gets the parent schema of the XML object schema. + + + Gets or sets the name of the xml object. + + + Gets or sets if the array elements are going to be wrapped or not. + + + Gets or sets the URL of the namespace definition. + + + Gets or sets the prefix for the name. + + + Gets or sets if the property definition translates into an attribute instead of an element. + + + Describes a schema discriminator. + + + Gets or sets the discriminator property name. + + + Gets or sets the discriminator mappings. + + + The currently used . 
+ + + Adds a discriminator mapping for the given type and schema based on the used . + The type. + The schema. + + + + Used to convert from Dictionary{string, JsonSchema4} (NJsonSchema model) to Dictionary{string, string} (OpenAPI). + See https://github.com/OAI/OpenAPI-Specification/blob/master/versions/3.0.2.md#discriminator-object and + issue https://github.com/RicoSuter/NSwag/issues/1684 + + + + A JSON object which may reference other objects with $ref. + The methods should be implemented explicitly to hide them from the API. + + + Gets the actual referenced object, either this or the reference object. + + + Gets the parent object which may be the root. + + + A JSON object which may reference other objects with $ref. + + + Gets or sets the type reference path ($ref). + + + Gets or sets the referenced object. + + + A base class which may reference another object. + The referenced object type. + + + Gets the document path (URI or file path) for resolving relative references. + + + Gets or sets the type reference path ($ref). + + + Gets or sets the referenced object. + + + Gets or sets the referenced object. + + + Extensions to work with . + + + Finds the root parent of this schema. + The parent schema or this when this is the root. + + + Defines how to express the nullability of a property. + + + Uses oneOf with null schema and null type to express the nullability of a property (valid JSON Schema draft v4). + + + Uses required to express the nullability of a property (not valid in JSON Schema draft v4). + + + Uses null handling of Open API 3. + + + A subschema validation error. + + + Initializes a new instance of the class. + The error kind. + The property name. + The property path. + The error list. + The token that failed to validate. + The schema that contains the validation rule. + + + Gets the errors for each validated subschema. + + + Returns a string that represents the current object. + A string that represents the current object. + 2 + + + Validator for "Base64" format. + + + Gets the format attribute's value. + + + Gets the kind of error produced by validator. + + + Validates format of given value. + String value. + Type of token holding the value. + True if value is correct for given format, False - if not. + + + Validator for "Byte" format. + + + Gets the format attribute's value. + + + Gets the kind of error produced by validator. + + + Validates format of given value. + String value. + Type of token holding the value. + True if value is correct for given format, False - if not. + + + Validator for "Date" format. + + + Validates format of given value. + String value. + Type of token holding the value. + True if value is correct for given format, False - if not. + + + Gets the format attribute's value. + + + Returns validation error kind. + + + Validator for DateTime format. + + + Gets the format attribute's value. + + + Gets the validation error kind. + + + Validates if a string is valid DateTime. + String value. + Type of token holding the value. + + + + Validator for "Email" format. + + + Gets the format attribute's value. + + + Gets the kind of error produced by validator. + + + Validates format of given value. + String value. + Type of token holding the value. + True if value is correct for given format, False - if not. + + + Validator for "Guid" format. + + + Gets the format attribute's value. + + + Gets the kind of error produced by validator. + + + Validates format of given value. + String value. + Type of token holding the value. 
+ True if value is correct for given format, False - if not. + + + Validator for "Hostname" format. + + + Gets the format attribute's value. + + + Gets the kind of error produced by validator. + + + Validates format of given value. + String value. + Type of token holding the value. + True if value is correct for given format, False - if not. + + + Provides a method to verify if value is of valid format. + + + Validates format of given value. + String value. + Type of token holding the value. + True if value is correct for given format, False - if not. + + + Gets the kind of error produced by validator. + + + Gets the format attribute's value. + + + Validator for "IpV4" format. + + + Gets the format attribute's value. + + + Gets the kind of error produced by validator. + + + Validates format of given value. + String value. + Type of token holding the value. + True if value is correct for given format, False - if not. + + + Validator for "IpV6" format. + + + Gets the format attribute's value. + + + Gets the kind of error produced by validator. + + + Validates format of given value. + String value. + Type of token holding the value. + True if value is correct for given format, False - if not. + + + Validator for "Time" format. + + + Gets the format attribute's value. + + + Gets the kind of error produced by validator. + + + Validates format of given value. + String value. + Type of token holding the value. + True if value is correct for given format, False - if not. + + + Validator for "TimeSpan" format. + + + Gets the format attribute's value. + + + Gets the kind of error produced by validator. + + + Validates format of given value. + String value. + Type of token holding the value. + True if value is correct for given format, False - if not. + + + Validator for "Uri" format. + + + Gets the format attribute's value. + + + Gets the kind of error produced by validator. + + + Validates format of given value. + String value. + Type of token holding the value. + True if value is correct for given format, False - if not. + + + Validator for "Uuid" format. + + + Gets the format attribute's value. + + + Gets the kind of error produced by validator. + + + Validates format of given value. + String value. + Type of token holding the value. + True if value is correct for given format, False - if not. + + + Class to validate a JSON schema against a given . + + + + Initializes JsonSchemaValidator + + + + + Initializes JsonSchemaValidator + + + + Validates the given JSON data. + The json data. + The schema. + The type of the schema. + Could not deserialize the JSON data. + The list of validation errors. + + + Validates the given JSON token. + The token. + The schema. + The type of the schema. + The list of validation errors. + + + Validates the given JSON token. + The token. + The schema. + The type of the schema. + The current property name. + The current property path. + The list of validation errors. + + + Class to configure the behavior of . + + + Gets or sets the used to compare object properties. + + + Gets or sets the format validators. + + + + Adds a custom format validator to the array. + + The format validator. + + + A multi type validation error. + + + Initializes a new instance of the class. + The error kind. + The property name. + The property path. + The error list. + The token that failed to validate. + The schema that contains the validation rule. + + + Gets the errors for each validated type. + + + Returns a string that represents the current object. 
+ A string that represents the current object. + 2 + + + A validation error. + + + Initializes a new instance of the class. + The error kind. + The property name. + The property path. + The token that failed to validate. + The schema that contains the validation rule. + + + Gets the error kind. + + + Gets the property name. + + + Gets the property path. + + + Indicates whether or not the error contains line information. + + + Gets the line number the validation failed on. + + + Gets the line position the validation failed on. + + + Gets the schema element that contains the validation rule. + + + Returns a string that represents the current object. + A string that represents the current object. + 2 + + + Enumeration of the possible error kinds. + + + An unknown error. + + + A string is expected. + + + A number is expected. + + + An integer is expected. + + + A boolean is expected. + + + An object is expected. + + + The property is required but not found. + + + An array is expected. + + + An array is expected. + + + The Regex pattern does not match. + + + The string is too short. + + + The string is too long. + + + The number is too small. + + + The number is too big. + + + The integer is too big. + + + The array contains too many items. + + + The array contains too few items. + + + The items in the array are not unique. + + + A date time is expected. + + + A date is expected. + + + A time is expected. + + + A time-span is expected. + + + An URI is expected. + + + An IP v4 address is expected. + + + An IP v6 address is expected. + + + A valid GUID is expected. + + + The object is not any of the given schemas. + + + The object is not all of the given schemas. + + + The object is not one of the given schemas. + + + The object matches the not allowed schema. + + + The number is not a multiple of the given number. + + + The integer is not a multiple of the given integer. + + + The value is not one of the allowed enumerations. + + + An Email is expected. + + + An hostname is expected. + + + The array tuple contains too many items. + + + An array item is not valid. + + + The item is not valid with the AdditionalItems schema. + + + The additional properties are not valid. + + + Additional/unspecified properties are not allowed. + + + There are too many properties in the object. + + + There are too few properties in the tuple. + + + A Base64 string is expected. + + + No type of the types does validate (check error details in ). + + + A valid UUID is expected. + + + Visitor to transform an object with objects. + + + Initializes a new instance of the class. + + + Initializes a new instance of the class. + The contract resolver. + + + Processes an object. + The object to process. + The task. + + + Processes an object. + The object to process. + Cancellation token instance + The task. + + + Called when a is visited. + The visited schema. + The path. + The type name hint. + The cancellation token + The task. + + + Processes an object. + The object to process. + The path + The type name hint. + The checked objects. + The replacer. + The cancellation token + The task. + + + Visitor to transform an object with objects. + + + Called when a is visited. + The visited schema. + The path. + The type name hint. + The cancellation token + The task. + + + Called when a is visited. + The visited schema. + The path. + The type name hint. + The cancellation token + The task. + + + Visitor to transform an object with objects. + + + Initializes a new instance of the class. + + + Initializes a new instance of the class. 
+ The contract resolver. + + + Processes an object. + The object to process. + The task. + + + Called when a is visited. + The visited schema. + The path. + The type name hint. + The task. + + + Processes an object. + The object to process. + The path + The type name hint. + The checked objects. + The replacer. + The task. + + + Visitor to transform an object with objects. + + + Called when a is visited. + The visited schema. + The path. + The type name hint. + The task. + + + Called when a is visited. + The visited schema. + The path. + The type name hint. + The task. + + + diff --git a/Plugins/Managed/NJsonSchema.xml.meta b/Plugins/Managed/NJsonSchema.xml.meta new file mode 100644 index 0000000..7560a54 --- /dev/null +++ b/Plugins/Managed/NJsonSchema.xml.meta @@ -0,0 +1,7 @@ +fileFormatVersion: 2 +guid: 244685d3804d9479eb7dafabf4e40a7a +TextScriptImporter: + externalObjects: {} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Plugins/Managed/Namotion.Reflection.dll b/Plugins/Managed/Namotion.Reflection.dll new file mode 100644 index 0000000..8242003 Binary files /dev/null and b/Plugins/Managed/Namotion.Reflection.dll differ diff --git a/Plugins/Managed/Namotion.Reflection.dll.meta b/Plugins/Managed/Namotion.Reflection.dll.meta new file mode 100644 index 0000000..d631710 --- /dev/null +++ b/Plugins/Managed/Namotion.Reflection.dll.meta @@ -0,0 +1,33 @@ +fileFormatVersion: 2 +guid: 1f5a35e4767fa4eccac6a73be563efa5 +PluginImporter: + externalObjects: {} + serializedVersion: 2 + iconMap: {} + executionOrder: {} + defineConstraints: [] + isPreloaded: 0 + isOverridable: 1 + isExplicitlyReferenced: 0 + validateReferences: 1 + platformData: + - first: + Any: + second: + enabled: 1 + settings: {} + - first: + Editor: Editor + second: + enabled: 0 + settings: + DefaultValueInitialized: true + - first: + Windows Store Apps: WindowsStoreApps + second: + enabled: 0 + settings: + CPU: AnyCPU + userData: + assetBundleName: + assetBundleVariant: diff --git a/Plugins/Managed/Namotion.Reflection.xml b/Plugins/Managed/Namotion.Reflection.xml new file mode 100644 index 0000000..a51d3d6 --- /dev/null +++ b/Plugins/Managed/Namotion.Reflection.xml @@ -0,0 +1,1091 @@ + + + + Namotion.Reflection + + + + + A cached type object without context. + + + + + Clears the cache. + + + + + Internal generic arguments. + + + + + Internal original generic arguments. + + + + + Internal element type. + + + + + Unwraps the OriginalType as from the context type. + + The contextual type + + + + Gets the original type (i.e. without unwrapping Nullable{T}). + + + + + Gets all type attributes. + + + + + Gets the type name. + + + + + Gest the original's type info. + + + + + Gets the type's associated attributes of the type (inherited). + + + + + Gets the actual unwrapped type (e.g. gets T of a Nullable{T} type). + + + + + Gets a value indicating whether this type is wrapped with Nullable{T}. + + + + + Gets the type's generic arguments (Nullable{T} is unwrapped). + + + + + Gets the type's original generic arguments (Nullable{T} is not unwrapped). + + + + + Gets the type's element type (i.e. array type). + + + + + Gets an attribute of the given type which is defined on the type. + + The attribute type. + The attribute or null. + + + + Gets the attributes of the given type which are defined on the type. + + The attribute type. + The attributes. + + + + + + Gets the cached type for the given type and nullable flags index. + The type. + The flags. + The cached type. + + + + Updates the original generic arguments. 
+ + + + + Updates the original generic arguments. + + + + Base class for a contextual property or field. + + + + Gets the accessor's type. + + + + + Gets the nullability information of this accessor's type in the given context by unwrapping Nullable{T}. + + + + + Gets the accessor's contextual attributes (e.g. attributes on property or field). + + + + + Returns the value of a field supported by a given object. + + The object. + The value. + + + + Sets the value of the field supported by the given object. + + The object. + The value. + + + + Gets an attribute of the given type which is defined on the context (property, field, parameter or contextual generic argument type). + + The attribute type. + The attribute or null. + + + + Gets the attributes of the given type which are defined on the context (property, field, parameter or contextual generic argument type). + + The attribute type. + The attributes. + + + + A field info with contextual information. + + + + + Gets the type context's field info. + + + + + Gets the type context's member info. + + + + + + + + Gets the field's contextual type. + + + + + Gets the cached field name. + + + + + Returns the value of a field supported by a given object. + + The object. + The value. + + + + Sets the value of the field supported by the given object. + + The object. + The value. + + + + A member info with contextual information. + + + + + Gets the type context's member info. + + + + + Gets the name of the cached member name (property or parameter name). + + + + + + + + A method info with contextual information. + + + + + Gets the type context's method info. + + + + + Gets the name of the cached method name. + + + + + Gets the contextual parameters. + + + + + Gets the contextual return parameter. + + + + + + + + + + + A parameter info with contextual information. + + + + + Gets the type context's parameter info. + + + + + Gets the cached parameter name. + + + + + + + + A property info with contextual information. + + + + + Gets the type context's property info. + + + + + + + + Gets the properties contextual type. + + + + + Gets the cached field name. + + + + + Gets the type context's member info. + + + + + Gets a value indicating whether the property can be written to. + + + + + Gets a value indicating whether the property can be read from. + + + + + Returns the value of a field supported by a given object. + + The object. + The value. + + + + Sets the value of the field supported by the given object. + + The object. + The value. + + + + A cached type with context information (e.g. parameter, field, property with nullability). + + + + + Gets the parent type with context. + + + + + Gets the type's associated attributes of the given context (inherited). + + + + + Gets the original nullability information of this type in the given context (i.e. without unwrapping Nullable{T}). + + + + + Gets all contextual and type attributes (in this order). + + + + + Gets the generic type arguments of the type in the given context (empty when unwrapped from Nullable{T}). + + + + + Gets the original generic type arguments of the type in the given context. + + + + + Gets the type's element type (i.e. array type). + + + + + Gets the type's element type (i.e. array type). + + + + + Gets the type's base type + + + + + Gets the nullability information of this type in the given context by unwrapping Nullable{T} into account. + + + + + Gets a value indicating whether the System.Type is a value type. 
+ + + + + Gets an attribute of the given type which is defined on the context (property, field, parameter or contextual generic argument type). + + The attribute type. + The attribute or null. + + + + Gets the attributes of the given type which are defined on the context (property, field, parameter or contextual generic argument type). + + The attribute type. + The attributes. + + + + Gets an attribute of the given type which is defined on the context or on the type. + + The attribute type. + The attribute or null. + + + + Gets the attributes of the given type which are defined on the context or on the type. + + The attribute type. + The attributes. + + + + Gets the contextual properties of this type. + + + + + Gets the contextual methods of this type (runtime). + + + + + Gets the contextual properties of this type. + + + + + Gets a contextual property of the given contextual type (preserving the context). + + The property name. + The contextual property or null. + + + + Gets a contextual field of the given contextual type (preserving the context). + + The field name. + The contextual field or null. + + + + + + Gets the cached type for the given type and nullable flags index. + The type. + The flags. + The cached type. + + + + Helper to support old runtimes. + + + + + Type and member extension methods to extract contextual or cached types. + + + + + Gets a for the given instance. + + The type. + The . + + + + Gets a for the given instance. + + The type. + The . + + + + Gets an enumerable of s (all properties and fields) for the given instance. + + The type. + + + + + Gets an array of for the given instance. + + The type. + The runtime properties. + + + + Gets an array of for the given instance. + + The type. + The runtime fields. + + + + Gets an uncached for the given instance and attributes. + + The type. + The attributes. + The . + + + + Gets a for the given instance. + Warning: Retrieving contextual information directly from might lose original context data (NRT on original generic type parameters). + + The parameter info. + The . + + + + Gets a for the given instance. + Warning: Retrieving contextual information directly from might lose original context data (NRT on original generic type parameters). + + The property info. + The . + + + + Gets a for the given instance. + Warning: Retrieving contextual information directly from might lose original context data (NRT on original generic type parameters). + + The field info. + The . + + + + Gets a for the given instance. + Warning: Retrieving contextual information directly from might lose original context data (NRT on original generic type parameters). + + The member info. + The . + + + + Specifies the nullability in the given context. + + + + + Nullability is unknown (NRT is not enabled). + + + + + Reference type is not nullable. + + + + + Reference type can be null. + + + + + IEnumerable extensions. + + + + Tries to get the first object which is assignable to the given type name. + The objects. + Type of the attribute. + The type name style. + The objects which are assignable. + + + Tries to get the first object which is assignable to the given type name. + The objects. + Type of the attribute. + The type name style. + May return null (not found). + + + Finds the first common base type of the given types. + The types. + The common base type. + + + Provides dynamic access to framework APIs. + + + Gets a value indicating whether file APIs are available. + + + Gets a value indicating whether path APIs are available. 
+ + + Gets a value indicating whether path APIs are available. + + + Gets a value indicating whether XPath APIs are available. + + + Gets the current working directory. + The directory path. + The System.IO.Directory API is not available on this platform. + + + Checks whether a file exists. + The file path. + true or false + The System.IO.File API is not available on this platform. + + + Read a file. + The file path. + The file content. + The System.IO.File API is not available on this platform. + + + Checks whether a directory exists. + The directory path. + true or false + The System.IO.File API is not available on this platform. + + + Gets all files of directory and its sub-directories. + The directory path. + The search pattern. + true or false + The System.IO.Directory API is not available on this platform. + + + Gets all files of directory. + The directory path. + The search pattern. + true or false + The System.IO.Directory API is not available on this platform. + + + Combines two paths. + The path1. + The path2. + The combined path. + The System.IO.Path API is not available on this platform. + + + Gets the directory path of a file path. + The file path. + The directory name. + The System.IO.Path API is not available on this platform. + + + Evaluates the XPath for a given XML document. + The document. + The path. + The value. + The System.Xml.XPath.Extensions API is not available on this platform. + + + + Object extensions. + + + + Determines whether the specified property name exists. + The object. + Name of the property. + true if the property exists; otherwise, false. + + + Determines whether the specified property name exists. + The object. + Name of the property. + The default value if the property does not exist. + The property or the default value. + + + + Gets or sets a value indicating whether to disable nullability validation completely (global). + + + + Checks whether the object has valid non nullable properties. + The object. + Specifies whether to also recursively check children. + The result. + + + Checks whether the object has valid non nullable properties. + The object. + Specifies whether to also recursively check children. + The result. + + + Checks which non nullable properties are null (invalid). + The object. + Specifies whether to also recursively check children. + The result. + + + + Caching layer hiding the details of accessing DLL documentation. + + + + + Contains extension for . + + + + + Allows to append multiple strings to the at once. + + + Only strings that are neither null nor string.Empty will be added. + + The instance of . + The values to appends. + The value of . + + + + Allows to append multiple strings to the at once. + + + Only strings that are neither null nor string.Empty will be added. + + The instance of . + First value to append. + Second value to append. + Third value to append. (optional) + Fourth value to append. (optional) + Fifth value to append. (optional) + Sixth value to append. (optional) + The value of . + + + Provides extension methods for reflection. + + + Checks whether the given type is assignable to the given type name. + The type. + Name of the type. + The type name style. + + + + Checks whether the given type is assignable to the given type name. + The type. + Name of the type. + The type name style. + + + + Checks whether the given type inherits from the given type name. + The type. + Name of the type. + The type name style. + true if the type inherits from typeName. + + + Gets the type of the array item. 
+ + + Gets the generic type arguments of a type or its base type. + The type. + The type arguments. + + + Gets a human readable type name (e.g. DictionaryOfStringAndObject). + The type. + The type name. + + + The type name style. + + + Only the name of the type. + + + The full name of the type including the namespace. + + + Provides extension methods for reading XML comments from reflected members. + + + + Clears the cache. + + + + Provides extension methods for reading XML comments from reflected members. + + + Returns the contents of the "summary" XML documentation tag for the specified member. + The type. + The XML docs reading and formatting options. + The contents of the "summary" tag for the member. + + + Returns the contents of the "remarks" XML documentation tag for the specified member. + The type. + The XML docs reading and formatting options. + The contents of the "summary" tag for the member. + + + Returns the contents of an XML documentation tag for the specified member. + The type. + Name of the tag. + The XML docs reading and formatting options. + The contents of the "summary" tag for the member. + + + Returns the contents of the "summary" XML documentation tag for the specified member. + The reflected member. + The XML docs reading and formatting options. + The contents of the "summary" tag for the member. + + + Returns the contents of the "remarks" XML documentation tag for the specified member. + The reflected member. + The XML docs reading and formatting options. + The contents of the "summary" tag for the member. + + + Returns the contents of an XML documentation tag for the specified member. + The reflected member. + Name of the tag. + The XML docs reading and formatting options. + The contents of the "summary" tag for the member. + + + Returns the contents of the "returns" or "param" XML documentation tag for the specified parameter. + The reflected parameter or return info. + The XML docs reading and formatting options. + The contents of the "returns" or "param" tag. + + + Returns the contents of the "summary" XML documentation tag for the specified member. + The type. + The XML docs reading and formatting options. + The contents of the "summary" tag for the member. + + + Returns the contents of the "remarks" XML documentation tag for the specified member. + The type. + The XML docs reading and formatting options. + The contents of the "summary" tag for the member. + + + Returns the contents of an XML documentation tag for the specified member. + The type. + Name of the tag. + The XML docs reading and formatting options. + The contents of the "summary" tag for the member. + + + Returns the contents of the "summary" XML documentation tag for the specified member. + The reflected member. + The XML docs reading and formatting options. + The contents of the "summary" tag for the member. + + + Returns the contents of the "remarks" XML documentation tag for the specified member. + The reflected member. + The XML docs reading and formatting options. + The contents of the "summary" tag for the member. + + + Returns the contents of an XML documentation tag for the specified member. + The reflected member. + The XML docs reading and formatting options. + The contents of the "summary" tag for the member. + + + Returns the contents of the "summary" XML documentation tag for the specified member. + The reflected member. + The path to the XML documentation file. + The XML docs reading and formatting options. + The contents of the "summary" tag for the member. 
+ + + Returns the contents of an XML documentation tag for the specified member. + The reflected member. + Name of the tag. + The XML docs reading and formatting options. + The contents of the "summary" tag for the member. + + + Returns the property summary of a Record type which is read from the param tag on the type. + The reflected member. + The XML docs reading and formatting options. + The contents of the "param" tag of the Record property. + + + Returns the contents of the "returns" or "param" XML documentation tag for the specified parameter. + The reflected parameter or return info. + The XML docs reading and formatting options. + The contents of the "returns" or "param" tag. + + + Returns the contents of the "returns" or "param" XML documentation tag for the specified parameter. + The reflected parameter or return info. + The path to the XML documentation file. + The XML docs reading and formatting options. + The contents of the "returns" or "param" tag. + + + Converts the given XML documentation to text. + The XML element. + The XML docs reading and formatting options. + The text + + + Unknown member type. + + + + Gets the file path to the XML docs for the given assembly. + + The assembly. + The XML docs reading and formatting options. + The file path or null if not found. + + + + Get Type from a referencing string such as !:MyType or !:MyType.MyProperty + + + + + Contains the logic to maintain formatting of XML-doc elements. + + + + + Appends the value of to the respecting + to generate additional formatting information. + + The current . + The to append. + The for generating additional formatting tags. + The passed in . + + + + Appends the value of without any formatting information. + + The current . + The to append. + The value of . + + + + Appends the value of surrounded by the neccessary HTML-tags to maintain its formatting information (if supported). + + The current . + The to append. + The value of . + + + + Appends the value of surrounded by the neccessary Markdown-tags to maintain its formatting information (if supported). + + The current . + The to append. + The value of . + + + + Appends the value of surrounded by the tags specified in (if supported). + + The current . + The to append. + Map of formatting tags. + The value of . + + + + Appends the value of surrounded by and + to maintain its formatting information (if supported). + + The current . + The to append. + The start-tag. + The end-tag. + The value of . + + + + Contains the formatting modes supported. + + + + + Doesn't use any formatting. + + + + + Maintains formatting through HTML-tags. + + + + + Maintains formatting through Markdown-tags. + + + + + Contains constants for element and attribute names use in XML-docs. + + + + + Name of the summary element. + + + + + Name of the remarks element. + + + + + Name of the param element. + + + + + Name of the name attribute used in conjunction with the . + + + + + Name of the paramref element. + + + + + Name of the name attribute used in conjunction with the . + + + + + Name of the see element. + + + + + Name of the langword attribute used in conjunction with the . + + + + + Name of the cref attribute used in conjunction with the . + + + + + Name of the href attribute used in conjunction with the . + + + + + Name of the returns element. + + + + + Name of the inheritdoc element. + + + + + Contains all options to control generation of XML-docs. + + + + + The default options. 
+ + + + + Specifies whether tho resolve the XML Docs from the NuGet cache or .NET SDK directory. + + + + + Specifies how formatting tags should be processed. + + + + diff --git a/Plugins/Managed/Namotion.Reflection.xml.meta b/Plugins/Managed/Namotion.Reflection.xml.meta new file mode 100644 index 0000000..68a0c08 --- /dev/null +++ b/Plugins/Managed/Namotion.Reflection.xml.meta @@ -0,0 +1,7 @@ +fileFormatVersion: 2 +guid: 508aed6f690de434b868b8c8e180fb4a +TextScriptImporter: + externalObjects: {} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Plugins/Web.meta b/Plugins/Web.meta new file mode 100644 index 0000000..460028a --- /dev/null +++ b/Plugins/Web.meta @@ -0,0 +1,8 @@ +fileFormatVersion: 2 +guid: fd98ea3d8c21142a69778b620bb4e6d0 +folderAsset: yes +DefaultImporter: + externalObjects: {} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Plugins/Web/libVideoKit.a b/Plugins/Web/libVideoKit.a new file mode 100644 index 0000000..c7d08dc Binary files /dev/null and b/Plugins/Web/libVideoKit.a differ diff --git a/Plugins/Web/libVideoKit.a.meta b/Plugins/Web/libVideoKit.a.meta new file mode 100644 index 0000000..efb927c --- /dev/null +++ b/Plugins/Web/libVideoKit.a.meta @@ -0,0 +1,87 @@ +fileFormatVersion: 2 +guid: 1be90bdfa8b234523a45d0c70cfdd954 +PluginImporter: + externalObjects: {} + serializedVersion: 2 + iconMap: {} + executionOrder: {} + defineConstraints: [] + isPreloaded: 0 + isOverridable: 1 + isExplicitlyReferenced: 0 + validateReferences: 1 + platformData: + - first: + : Any + second: + enabled: 0 + settings: + Exclude Android: 1 + Exclude Editor: 1 + Exclude Linux64: 1 + Exclude OSXUniversal: 1 + Exclude WebGL: 0 + Exclude Win: 1 + Exclude Win64: 1 + Exclude iOS: 1 + - first: + Android: Android + second: + enabled: 0 + settings: + AndroidSharedLibraryType: Executable + CPU: ARMv7 + - first: + Any: + second: + enabled: 0 + settings: {} + - first: + Editor: Editor + second: + enabled: 0 + settings: + CPU: AnyCPU + DefaultValueInitialized: true + OS: AnyOS + - first: + Standalone: Linux64 + second: + enabled: 0 + settings: + CPU: None + - first: + Standalone: OSXUniversal + second: + enabled: 0 + settings: + CPU: None + - first: + Standalone: Win + second: + enabled: 0 + settings: + CPU: None + - first: + Standalone: Win64 + second: + enabled: 0 + settings: + CPU: None + - first: + WebGL: WebGL + second: + enabled: 1 + settings: {} + - first: + iPhone: iOS + second: + enabled: 0 + settings: + AddToEmbeddedBinaries: false + CPU: AnyCPU + CompileFlags: + FrameworkDependencies: + userData: + assetBundleName: + assetBundleVariant: diff --git a/Plugins/Windows.meta b/Plugins/Windows.meta new file mode 100644 index 0000000..ce829d9 --- /dev/null +++ b/Plugins/Windows.meta @@ -0,0 +1,8 @@ +fileFormatVersion: 2 +guid: e85103256f5354967b34956d4becf573 +folderAsset: yes +DefaultImporter: + externalObjects: {} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Plugins/Windows/VideoKit.dll b/Plugins/Windows/VideoKit.dll new file mode 100644 index 0000000..8e0cf30 Binary files /dev/null and b/Plugins/Windows/VideoKit.dll differ diff --git a/Plugins/Windows/VideoKit.dll.meta b/Plugins/Windows/VideoKit.dll.meta new file mode 100644 index 0000000..1b4b51b --- /dev/null +++ b/Plugins/Windows/VideoKit.dll.meta @@ -0,0 +1,82 @@ +fileFormatVersion: 2 +guid: 7d661a389cbd444d7a9860684c65df7b +PluginImporter: + externalObjects: {} + serializedVersion: 2 + iconMap: {} + executionOrder: {} + defineConstraints: [] + isPreloaded: 0 + isOverridable: 1 + 
isExplicitlyReferenced: 0 + validateReferences: 1 + platformData: + - first: + : Any + second: + enabled: 0 + settings: + Exclude Android: 1 + Exclude Editor: 0 + Exclude Linux64: 0 + Exclude OSXUniversal: 0 + Exclude WebGL: 1 + Exclude Win: 0 + Exclude Win64: 0 + Exclude iOS: 1 + - first: + Android: Android + second: + enabled: 0 + settings: + AndroidSharedLibraryType: Executable + CPU: ARMv7 + - first: + Any: + second: + enabled: 0 + settings: {} + - first: + Editor: Editor + second: + enabled: 1 + settings: + CPU: AnyCPU + DefaultValueInitialized: true + OS: Windows + - first: + Standalone: Linux64 + second: + enabled: 1 + settings: + CPU: None + - first: + Standalone: OSXUniversal + second: + enabled: 1 + settings: + CPU: None + - first: + Standalone: Win + second: + enabled: 1 + settings: + CPU: None + - first: + Standalone: Win64 + second: + enabled: 1 + settings: + CPU: x86_64 + - first: + iPhone: iOS + second: + enabled: 0 + settings: + AddToEmbeddedBinaries: false + CPU: AnyCPU + CompileFlags: + FrameworkDependencies: + userData: + assetBundleName: + assetBundleVariant: diff --git a/Plugins/iOS.meta b/Plugins/iOS.meta new file mode 100644 index 0000000..f440c54 --- /dev/null +++ b/Plugins/iOS.meta @@ -0,0 +1,8 @@ +fileFormatVersion: 2 +guid: f0ca55904108641e1bff9e42f7ac0f62 +folderAsset: yes +DefaultImporter: + externalObjects: {} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Plugins/iOS/VideoKit.framework.meta b/Plugins/iOS/VideoKit.framework.meta new file mode 100644 index 0000000..68231e5 --- /dev/null +++ b/Plugins/iOS/VideoKit.framework.meta @@ -0,0 +1,81 @@ +fileFormatVersion: 2 +guid: 9be064b24aea241088782fb2873217fb +PluginImporter: + externalObjects: {} + serializedVersion: 2 + iconMap: {} + executionOrder: {} + defineConstraints: [] + isPreloaded: 0 + isOverridable: 1 + isExplicitlyReferenced: 0 + validateReferences: 1 + platformData: + - first: + : Any + second: + enabled: 0 + settings: + Exclude Android: 1 + Exclude Editor: 1 + Exclude Linux64: 1 + Exclude OSXUniversal: 1 + Exclude WebGL: 1 + Exclude Win: 1 + Exclude Win64: 1 + Exclude iOS: 0 + - first: + Android: Android + second: + enabled: 0 + settings: + CPU: ARMv7 + - first: + Any: + second: + enabled: 0 + settings: {} + - first: + Editor: Editor + second: + enabled: 0 + settings: + CPU: AnyCPU + DefaultValueInitialized: true + OS: AnyOS + - first: + Standalone: Linux64 + second: + enabled: 0 + settings: + CPU: None + - first: + Standalone: OSXUniversal + second: + enabled: 0 + settings: + CPU: None + - first: + Standalone: Win + second: + enabled: 0 + settings: + CPU: None + - first: + Standalone: Win64 + second: + enabled: 0 + settings: + CPU: None + - first: + iPhone: iOS + second: + enabled: 1 + settings: + AddToEmbeddedBinaries: true + CPU: ARM64 + CompileFlags: + FrameworkDependencies: + userData: + assetBundleName: + assetBundleVariant: diff --git a/Plugins/iOS/VideoKit.framework/Headers/VKTAsset.h b/Plugins/iOS/VideoKit.framework/Headers/VKTAsset.h new file mode 100644 index 0000000..a1f34c4 --- /dev/null +++ b/Plugins/iOS/VideoKit.framework/Headers/VKTAsset.h @@ -0,0 +1,202 @@ +// +// VKTAsset.h +// VideoKit +// +// Created by Yusuf Olokoba on 7/08/2023. +// Copyright © 2023 NatML Inc. All rights reserved. +// + +#pragma once + +#include +#include + +#pragma once + +#pragma region --Enumerations-- +/*! + @enum VKTAssetType + + @abstract Media asset type. + + @constant VKT_ASSET_TYPE_UNKNOWN + Unknown or unsupported asset type. 
+ + @constant VKT_ASSET_TYPE_VIDEO + Media asset is a video file. + + @constant VKT_ASSET_TYPE_AUDIO + Media asset is an audio file. +*/ +enum VKTAssetType { + VKT_ASSET_TYPE_UNKNOWN = 0, + VKT_ASSET_TYPE_IMAGE = 1, + VKT_ASSET_TYPE_AUDIO = 2, + VKT_ASSET_TYPE_VIDEO = 3, +}; +typedef enum VKTAssetType VKTAssetType; +#pragma endregion + + +#pragma region --Types-- +/*! + @abstract Callback invoked with the result of loading an asset. + + @param context + User context provided to the load function. + + @param path + Asset path. + + @param type + Asset type. + + @param width + Video width. + + @param height + Video height. + + @param frameRate + Video frame rate. + + @param sampleRate + Audio sample rate. + + @param channelCount + Audio channel count. + + @param duration + Asset duration in seconds. +*/ +typedef void (*VKTAssetLoadHandler) ( + void* context, + const char* path, + VKTAssetType type, + int32_t width, + int32_t height, + float frameRate, + int32_t sampleRate, + int32_t channelCount, + float duration +); + +/*! + @abstract Callback invoked with the result of the sharing action. + + @param context + User context provided to the share function. + + @param receiver + Identifier of receiving application chosen by the user. +*/ +typedef void (*VKTAssetShareHandler) (void* context, const char* receiver); +#pragma endregion + + +#pragma region --Asset-- +/*! + @function VKTAssetLoad + + @abstract Load a media asset. + + @discussion Load a media asset and return metadata. + + @param path + Path to media asset. + + @param handler + Completion handler to be invoked with the result of the load action. + + @param context + User context passed to the load handler. Can be `NULL`. + + @returns Load status. +*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTAssetLoad ( + const char* path, + VKTAssetLoadHandler handler, + void* context +); + +/*! + @function VKTAssetLoadFromCameraRoll + + @abstract Load a media asset from the camera roll. + + @discussion Load a media asset and return metadata. + + @param type + Media asset type. + + @param handler + Completion handler to be invoked with the result of the load action. + + @param context + User context passed to the load handler. Can be `NULL`. + + @returns Load status. +*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTAssetLoadFromCameraRoll ( + VKTAssetType type, + VKTAssetLoadHandler handler, + void* context +); + +/*! + @function VKTAssetShare + + @abstract Share a media asset. + + @discussion Share a media asset. + + @param path + Path to media asset. + + @param message + Message to share along with the media asset. Can be `NULL`. + + @param handler + Completion handler to be invoked with the result of the sharing action. Can be `NULL`. + + @param context + User context passed to the completion handler. Can be `NULL`. + + @returns Share status. +*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTAssetShare ( + const char* path, + const char* message, + VKTAssetShareHandler handler, + void* context +); + +/*! + @function VKTAssetSaveToCameraRoll + + @abstract Save a media asset to the camera roll. + + @discussion Save a media asset to the camera roll. + + @param path + Path to media asset. + + @param album + Name of album to save asset to. Can be `NULL`. + + @param handler + Completion handler to be invoked with the result of the save action. Can be `NULL`. + Note that the `receiver` will receive the constant string "camera_roll" on successfully saving the media asset. + + @param context + User context passed to the completion handler. Can be `NULL`. 
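+
+    For illustration, a minimal call sketch. `VKTAssetSaveToCameraRoll`,
+    `VKTAssetShareHandler`, and `VKTStatus` are declared in this header; the
+    handler name, the media path, and the `NULL` album argument below are
+    assumptions for the sketch:
+
+        static void OnSaved (void* context, const char* receiver) {
+            // `receiver` is the constant string "camera_roll" on success.
+        }
+
+        VKTStatus status = VKTAssetSaveToCameraRoll(
+            "recording.mp4",    // path to a previously recorded media asset
+            NULL,               // album: `NULL` saves to the default album
+            OnSaved,            // completion handler, may be `NULL`
+            NULL                // user context
+        );
+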
+ + @returns Save status. +*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTAssetSaveToCameraRoll ( + const char* path, + const char* album, + VKTAssetShareHandler handler, + void* context +); +#pragma endregion \ No newline at end of file diff --git a/Plugins/iOS/VideoKit.framework/Headers/VKTCamera.h b/Plugins/iOS/VideoKit.framework/Headers/VKTCamera.h new file mode 100644 index 0000000..b3d65f2 --- /dev/null +++ b/Plugins/iOS/VideoKit.framework/Headers/VKTCamera.h @@ -0,0 +1,1177 @@ +// +// VKTCamera.h +// VideoKit +// +// Created by Yusuf Olokoba on 5/15/2023. +// Copyright © 2023 NatML Inc. All rights reserved. +// + +#pragma once + +#include + +#pragma region --Enumerations-- +/*! + @enum VKTExposureMode + + @abstract Camera device exposure mode. + + @constant VKT_EXPOSURE_MODE_CONTINUOUS + Continuous auto exposure. + + @constant VKT_EXPOSURE_MODE_LOCKED + Locked exposure. Exposure settings will be fixed to their current values. + Requires `VKT_CAMERA_FLAG_LOCKED_EXPOSURE` device flag. + + @constant VKT_EXPOSURE_MODE_MANUAL + Manual exposure. User will set exposure duration and sensitivity. + Requires `VKT_CAMERA_FLAG_MANUAL_EXPOSURE` device flag. +*/ +enum VKTExposureMode { + VKT_EXPOSURE_MODE_CONTINUOUS = 0, + VKT_EXPOSURE_MODE_LOCKED = 1, + VKT_EXPOSURE_MODE_MANUAL = 2 +}; +typedef enum VKTExposureMode VKTExposureMode; + +/*! + @enum VKTFlashMode + + @abstract Camera device photo flash modes. + + @constant VKT_FLASH_MODE_OFF + The flash will never be fired. + + @constant VKT_FLASH_MODE_ON + The flash will always be fired. + + @constant VKT_FLASH_MODE_AUTO + The sensor will determine whether to fire the flash. +*/ +enum VKTFlashMode { + VKT_FLASH_MODE_OFF = 0, + VKT_FLASH_MODE_ON = 1, + VKT_FLASH_MODE_AUTO = 2 +}; +typedef enum VKTFlashMode VKTFlashMode; + +/*! + @enum VKTFocusMode + + @abstract Camera device focus mode. + + @constant VKT_FOCUS_MODE_CONTINUOUS + Continuous auto focus. + + @constant VKT_FOCUS_MODE_LOCKED + Locked auto focus. Focus settings will be fixed to their current values. + Requires `VKT_CAMERA_FLAG_FOCUS_LOCK` device flag. +*/ +enum VKTFocusMode { + VKT_FOCUS_MODE_CONTINUOUS = 0, + VKT_FOCUS_MODE_LOCKED = 1, +}; +typedef enum VKTFocusMode VKTFocusMode; + +/*! + @enum VKTImageFormat + + @abstract Camera image format. + + @constant VKT_IMAGE_FORMAT_UNKNOWN + Unknown or invalid format. + + @constant VKT_IMAGE_FORMAT_YCbCr420 + YUV semi-planar format. + + @constant VKT_IMAGE_FORMAT_RGBA8888 + RGBA8888 interleaved format. + + @constant VKT_IMAGE_FORMAT_BGRA8888 + BGRA8888 interleaved format. + */ +enum VKTImageFormat { + VKT_IMAGE_FORMAT_UNKNOWN = 0, + VKT_IMAGE_FORMAT_YCbCr420 = 1, + VKT_IMAGE_FORMAT_RGBA8888 = 2, + VKT_IMAGE_FORMAT_BGRA8888 = 3, +}; +typedef enum VKTImageFormat VKTImageFormat; + +/*! + @enum VKTImageOrientation + + @abstract Camera device frame orientation. + + @constant VKT_ORIENTATION_LAVKTSCAPE_LEFT + Landscape left. + + @constant VKT_ORIENTATION_PORTRAIT + Portrait. + + @constant VKT_ORIENTATION_LAVKTSCAPE_RIGHT + Landscape right. + + @constant VKT_ORIENTATION_PORTRAIT_UPSIDE_DOWN + Portrait upside down. +*/ +enum VKTImageOrientation { + VKT_ORIENTATION_LANDSCAPE_LEFT = 3, + VKT_ORIENTATION_PORTRAIT = 1, + VKT_ORIENTATION_LANDSCAPE_RIGHT = 4, + VKT_ORIENTATION_PORTRAIT_UPSIDE_DOWN = 2 +}; +typedef enum VKTImageOrientation VKTImageOrientation; + +/*! + @enum VKTMetadata + + @abstract Sample buffer metadata key. + + @constant VKT_METADATA_INTRINSIC_MATRIX + Camera intrinsic matrix. Value array must have enough capacity for 9 float values. 
+ + @constant VKT_METADATA_EXPOSURE_BIAS + Camera image exposure bias value in EV. + + @constant VKT_METADATA_EXPOSURE_DURATION + Camera image exposure duration in seconds. + + @constant VKT_METADATA_FOCAL_LENGTH + Camera image focal length. + + @constant VKT_METADATA_F_NUMBER + Camera image aperture F-number. + + @constant VKT_METADATA_BRIGHTNESS + Camera image ambient brightness. +*/ +enum VKTMetadata { + VKT_METADATA_INTRINSIC_MATRIX = 1, + VKT_METADATA_EXPOSURE_BIAS = 2, + VKT_METADATA_EXPOSURE_DURATION = 3, + VKT_METADATA_FOCAL_LENGTH = 4, + VKT_METADATA_F_NUMBER = 5, + VKT_METADATA_BRIGHTNESS = 6, + VKT_METADATA_ISO = 7, +}; +typedef enum VKTMetadata VKTMetadata; + +/*! + @enum VKTTorchMode + + @abstract Camera device torch mode. + + @constant VKT_TORCH_MODE_OFF + Disabled torch mode. + + @constant VKT_TORCH_MODE_MAXIMUM + Maximum torch mode. + Requires `VKT_CAMERA_FLAG_TORCH` device flag. +*/ +enum VKTTorchMode { + VKT_TORCH_MODE_OFF = 0, + VKT_TORCH_MODE_MAXIMUM = 100, +}; +typedef enum VKTTorchMode VKTTorchMode; + +/*! + @enum VKTVideoStabilizationMode + + @abstract Camera device video stabilization mode. + + @constant VKT_VIDEO_STABILIZATION_OFF + Disabled video stabilization. + + @constant VKT_VIDEO_STABILIZATION_STANDARD + Standard video stabilization + Requires `VKT_CAMERA_FLAG_VIDEO_STABILIZATION` device flag. +*/ +enum VKTVideoStabilizationMode { + VKT_VIDEO_STABILIZATION_OFF = 0, + VKT_VIDEO_STABILIZATION_STANDARD = 1, +}; +typedef enum VKTVideoStabilizationMode VKTVideoStabilizationMode; + +/*! + @enum VKTWhiteBalanceMode + + @abstract Camera device white balance mode. + + @constant VKT_WHITE_BALANCE_MODE_CONTINUOUS + Continuous auto white balance. + + @constant VKT_WHITE_BALANCE_MODE_LOCKED + Locked auto white balance. White balance settings will be fixed to their current values. + Requires `VKT_CAMERA_FLAG_WHITE_BALANCE_LOCK` device flag. +*/ +enum VKTWhiteBalanceMode { + VKT_WHITE_BALANCE_MODE_CONTINUOUS = 0, + VKT_WHITE_BALANCE_MODE_LOCKED = 1, +}; +typedef enum VKTWhiteBalanceMode VKTWhiteBalanceMode; +#pragma endregion + + +#pragma region --Types-- +/*! + @typedef VKTCamera + + @abstract Camera device. + + @discussion Camera device. +*/ +typedef VKTDevice VKTCamera; + +/*! + @typedef VKTCameraImage + + @abstract Camera image. + + @discussion Camera image. +*/ +typedef VKTSampleBuffer VKTCameraImage; +#pragma endregion + + +#pragma region --CameraDevice-- +/*! + @function VKTDiscoverCameras + + @abstract Discover available camera devices. + + @discussion Discover available camera devices. + + @param handler + Device handler. MUST NOT be `NULL`. + + @param context + Handler context. + + @returns Status. +*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTDiscoverCameras ( + VKTDeviceDiscoveryHandler handler, + void* context +); + +/*! + @function VKTCameraGetFieldOfView + + @abstract Camera field of view in degrees. + + @discussion Camera field of view in degrees. + + @param camera + Camera device. + + @param outWidth + Output FOV width in degrees. + + @param outHeight + Output FOV height in degrees. +*/ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraGetFieldOfView ( + VKTCamera* camera, + float* outWidth, + float* outHeight +); + +/*! + @function VKTCameraGetExposureBiasRange + + @abstract Camera exposure bias range in EV. + + @discussion Camera exposure bias range in EV. + + @param camera + Camera device. + + @param outMin + Output minimum exposure bias. + + @param outMax + Output maximum exposure bias. 
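+
+    For example, a sketch that clamps a requested bias to the supported range
+    before applying it with `VKTCameraSetExposureBias` (declared below); the
+    `camera` handle is assumed to come from `VKTDiscoverCameras`, and the
+    requested bias value is illustrative:
+
+        float minBias = 0.f, maxBias = 0.f, bias = -1.f;
+        VKTCameraGetExposureBiasRange(camera, &minBias, &maxBias);
+        bias = bias < minBias ? minBias : bias > maxBias ? maxBias : bias;
+        VKTCameraSetExposureBias(camera, bias);
+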
+*/ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraGetExposureBiasRange ( + VKTCamera* camera, + float* outMin, + float* outMax +); + +/*! + @function VKTCameraGetExposureDurationRange + + @abstract Camera exposure duration range in seconds. + + @discussion Camera exposure duration range in seconds. + + @param camera + Camera device. + + @param outMin + Output minimum exposure duration in seconds. + + @param outMax + Output maximum exposure duration in seconds. +*/ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraGetExposureDurationRange ( + VKTCamera* camera, + float* outMin, + float* outMax +); + +/*! + @function VKTCameraGetISORange + + @abstract Camera sensor sensitivity range. + + @discussion Camera sensor sensitivity range. + + @param camera + Camera device. + + @param outMin + Output minimum ISO sensitivity value. + + @param outMax + Output maximum ISO sensitivity value. +*/ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraGetISORange ( + VKTCamera* camera, + float* outMin, + float* outMax +); + +/*! + @function VKTCameraGetZoomRange + + @abstract Camera optical zoom range. + + @discussion Camera optical zoom range. + + @param camera + Camera device. + + @param outMin + Output minimum zoom ratio. + + @param outMax + Output maximum zoom ratio. +*/ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraGetZoomRange ( + VKTCamera* camera, + float* outMin, + float* outMax +); + +/*! + @function VKTCameraGetPreviewResolution + + @abstract Get the camera preview resolution. + + @discussion Get the camera preview resolution. + + @param camera + Camera device. + + @param outWidth + Output width in pixels. + + @param outHeight + Output height in pixels. +*/ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraGetPreviewResolution ( + VKTCamera* camera, + int32_t* outWidth, + int32_t* outHeight +); + +/*! + @function VKTCameraSetPreviewResolution + + @abstract Set the camera preview resolution. + + @discussion Set the camera preview resolution. + + Most camera devices do not support arbitrary preview resolutions, so the camera will + set a supported resolution which is closest to the requested resolution that is specified. + + Note that this method should only be called before the camera preview is started. + + @param camera + Camera device. + + @param width + Width in pixels. + + @param height + Height in pixels. +*/ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraSetPreviewResolution ( + VKTCamera* camera, + int32_t width, + int32_t height +); + +/*! + @function VKTCameraGetPhotoResolution + + @abstract Get the camera photo resolution. + + @discussion Get the camera photo resolution. + + @param camera + Camera device. + + @param outWidth + Output width in pixels. + + @param outHeight + Output height in pixels. +*/ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraGetPhotoResolution ( + VKTCamera* camera, + int32_t* outWidth, + int32_t* outHeight +); + +/*! + @function VKTCameraSetPhotoResolution + + @abstract Set the camera photo resolution. + + @discussion Set the camera photo resolution. + + Most camera devices do not support arbitrary photo resolutions, so the camera will + set a supported resolution which is closest to the requested resolution that is specified. + + Note that this method should only be called before the camera preview is started. + + @param camera + Camera device. + + @param width + Width in pixels. + + @param height + Height in pixels. +*/ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraSetPhotoResolution ( + VKTCamera* camera, + int32_t width, + int32_t height +); + +/*! 
+ @function VKTCameraGetFrameRate + + @abstract Get the camera preview frame rate. + + @discussion Get the camera preview frame rate. + + @param camera + Camera device. + + @returns Camera preview frame rate. +*/ +VKT_BRIDGE VKT_EXPORT int32_t VKT_API VKTCameraGetFrameRate (VKTCamera* camera); + +/*! + @function VKTCameraSetFrameRate + + @abstract Set the camera preview frame rate. + + @discussion Set the camera preview frame rate. + + Note that this method should only be called before the camera preview is started. + + @param camera + Camera device. + + @param frameRate + Frame rate to set. +*/ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraSetFrameRate ( + VKTCamera* camera, + int32_t frameRate +); + +/*! + @function VKTCameraGetExposureMode + + @abstract Get the camera exposure mode. + + @discussion Get the camera exposure mode. + + @param camera + Camera device. + + @returns Exposure mode. +*/ +VKT_BRIDGE VKT_EXPORT VKTExposureMode VKT_API VKTCameraGetExposureMode (VKTCamera* camera); + +/*! + @function VKTCameraSetExposureMode + + @abstract Set the camera exposure mode. + + @discussion Set the camera exposure mode. + + @param camera + Camera device. + + @param mode + Exposure mode. + */ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraSetExposureMode ( + VKTCamera* camera, + VKTExposureMode mode +); + +/*! + @function VKTCameraGetExposureBias + + @abstract Get the camera exposure bias. + + @discussion Get the camera exposure bias. + + @param camera + Camera device. + + @returns Camera exposure bias. + */ +VKT_BRIDGE VKT_EXPORT float VKT_API VKTCameraGetExposureBias (VKTCamera* camera); + +/*! + @function VKTCameraSetExposureBias + + @abstract Set the camera exposure bias. + + @discussion Set the camera exposure bias. + + Note that the value MUST be in the camera exposure range. + + @param camera + Camera device. + + @param bias + Exposure bias value to set. + */ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraSetExposureBias ( + VKTCamera* camera, + float bias +); + +/*! + @function VKTCameraGetExposureDuration + + @abstract Get the camera exposure duration. + + @discussion Get the camera exposure duration. + + @param camera + Camera device. + + @returns Camera exposure duration in seconds. + */ +VKT_BRIDGE VKT_EXPORT float VKT_API VKTCameraGetExposureDuration (VKTCamera* camera); + +/*! + @function VKTCameraGetISO + + @abstract Get the camera sensitivity. + + @discussion Get the camera sensitivity. + + @param camera + Camera device. + + @returns Camera sensitivity. + */ +VKT_BRIDGE VKT_EXPORT float VKT_API VKTCameraGetISO (VKTCamera* camera); + +/*! + @function VKTCameraSetExposureDuration + + @abstract Set the camera exposure duration. + + @discussion Set the camera exposure duration. + This method will automatically change the camera's exposure mode to `MANUAL`. + + @param camera + Camera device. + + @param duration + Exposure duration in seconds. MUST be in `ExposureDurationRange`. + + @param ISO + Shutter sensitivity. MUST be in `ISORange`. + */ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraSetExposureDuration ( + VKTCamera* camera, + float duration, + float ISO +); + +/*! + @function VKTCameraSetExposurePoint + + @abstract Set the camera exposure point of interest. + + @discussion Set the camera exposure point of interest. + The coordinates are specified in viewport space, with each value in range [0., 1.]. + + @param camera + Camera device. + + @param x + Exposure point x-coordinate in viewport space. + + @param y + Exposure point y-coordinate in viewport space. 
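+
+    Illustrative tap-to-expose sketch (assumes `camera` is a running camera device and
+    that `tapX`, `tapY`, `screenWidth`, `screenHeight` are supplied by the caller; only
+    the VideoKit calls below belong to this API):
+
+        if (VKTDeviceGetFlags(camera) & VKT_CAMERA_FLAG_EXPOSURE_POINT) {
+            float x = tapX / screenWidth;       // normalize to viewport space [0., 1.]
+            float y = tapY / screenHeight;
+            VKTCameraSetExposureMode(camera, VKT_EXPOSURE_MODE_CONTINUOUS);
+            VKTCameraSetExposurePoint(camera, x, y);
+        }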
+ */ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraSetExposurePoint ( + VKTCamera* camera, + float x, + float y +); + +/*! + @function VKTCameraGetFlashMode + + @abstract Get the camera photo flash mode. + + @discussion Get the camera photo flash mode. + + @param camera + Camera device. + + @returns Camera photo flash mode. + */ +VKT_BRIDGE VKT_EXPORT VKTFlashMode VKT_API VKTCameraGetFlashMode (VKTCamera* camera); + +/*! + @function VKTCameraSetFlashMode + + @abstract Set the camera photo flash mode. + + @discussion Set the camera photo flash mode. + + @param camera + Camera device. + + @param mode + Flash mode to set. + */ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraSetFlashMode ( + VKTCamera* camera, + VKTFlashMode mode +); + +/*! + @function VKTCameraGetFocusMode + + @abstract Get the camera focus mode. + + @discussion Get the camera focus mode. + + @param camera + Camera device. + + @returns Camera focus mode. + */ +VKT_BRIDGE VKT_EXPORT VKTFocusMode VKT_API VKTCameraGetFocusMode (VKTCamera* camera); + +/*! + @function VKTCameraSetFocusMode + + @abstract Set the camera focus mode. + + @discussion Set the camera focus mode. + + @param camera + Camera device. + + @param mode + Focus mode. + */ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraSetFocusMode ( + VKTCamera* camera, + VKTFocusMode mode +); + +/*! + @function VKTCameraSetFocusPoint + + @abstract Set the camera focus point. + + @discussion Set the camera focus point of interest. + The coordinates are specified in viewport space, with each value in range [0., 1.]. + This function should only be used if the camera supports setting the focus point. + See `VKTCameraFocusPointSupported`. + + @param camera + Camera device. + + @param x + Focus point x-coordinate in viewport space. + + @param y + Focus point y-coordinate in viewport space. + */ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraSetFocusPoint ( + VKTCamera* camera, + float x, + float y +); + +/*! + @function VKTCameraGetTorchMode + + @abstract Get the current camera torch mode. + + @discussion Get the current camera torch mode. + + @param camera + Camera device. + + @returns Current camera torch mode. + */ +VKT_BRIDGE VKT_EXPORT VKTTorchMode VKT_API VKTCameraGetTorchMode (VKTCamera* camera); + +/*! + @function VKTCameraSetTorchMode + + @abstract Set the camera torch mode. + + @discussion Set the camera torch mode. + + @param camera + Camera device. + + @param mode + Torch mode. + */ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraSetTorchMode ( + VKTCamera* camera, + VKTTorchMode mode +); + +/*! + @function VKTCameraGetWhiteBalanceMode + + @abstract Get the camera white balance mode. + + @discussion Get the camera white balance mode. + + @param camera + Camera device. + + @returns White balance mode. + */ +VKT_BRIDGE VKT_EXPORT VKTWhiteBalanceMode VKT_API VKTCameraGetWhiteBalanceMode (VKTCamera* camera); + +/*! + @function VKTCameraSetWhiteBalanceMode + + @abstract Set the camera white balance mode. + + @discussion Set the camera white balance mode. + + @param camera + Camera device. + + @param mode + White balance mode. + */ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraSetWhiteBalanceMode ( + VKTCamera* camera, + VKTWhiteBalanceMode mode +); + +/*! + @function VKTCameraGetVideoStabilizationMode + + @abstract Get the camera video stabilization mode. + + @discussion Get the camera video stabilization mode. + + @param camera + Camera device. + + @returns Video stabilization mode. 
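+
+    Sketch enabling stabilization only when the device advertises support (assumes
+    `camera` was obtained from `VKTDiscoverCameras`; `VKTDeviceGetFlags` is declared
+    in VKTDevice.h):
+
+        if (VKTDeviceGetFlags(camera) & VKT_CAMERA_FLAG_VIDEO_STABILIZATION)
+            VKTCameraSetVideoStabilizationMode(camera, VKT_VIDEO_STABILIZATION_STANDARD);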
+ */ +VKT_BRIDGE VKT_EXPORT VKTVideoStabilizationMode VKT_API VKTCameraGetVideoStabilizationMode (VKTCamera* camera); + +/*! + @function VKTCameraSetVideoStabilizationMode + + @abstract Set the camera video stabilization mode. + + @discussion Set the camera video stabilization mode. + + @param camera + Camera device. + + @param mode + Video stabilization mode. + */ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraSetVideoStabilizationMode ( + VKTCamera* camera, + VKTVideoStabilizationMode mode +); + +/*! + @function VKTCameraGetZoomRatio + + @abstract Get the camera zoom ratio. + + @discussion Get the camera zoom ratio. + This value will always be within the minimum and maximum zoom values reported by the camera device. + + @param camera + Camera device. + + @returns Zoom ratio. + */ +VKT_BRIDGE VKT_EXPORT float VKT_API VKTCameraGetZoomRatio (VKTCamera* camera); + +/*! + @function VKTCameraSetZoomRatio + + @abstract Set the camera zoom ratio. + + @discussion Set the camera zoom ratio. + This value must always be within the minimum and maximum zoom values reported by the camera device. + + @param camera + Camera device. + + @param ratio + Zoom ratio. + */ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraSetZoomRatio ( + VKTCamera* camera, + float ratio +); + +/*! + @function VKTCameraCapturePhoto + + @abstract Capture a still photo. + + @discussion Capture a still photo. + + @param camera + Camera device. + + @param handler + Photo handler. + + @param context + User-provided context. + */ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraCapturePhoto ( + VKTCamera* camera, + VKTSampleBufferHandler handler, + void* context +); +#pragma endregion + + +#pragma region --CameraImage-- +/*! + @function VKTCameraImageGetData + + @abstract Get the image data of a camera image. + + @discussion Get the image data of a camera image. + If the camera image uses a planar format, this will return `NULL`. + + @param cameraImage + Camera image. +*/ +VKT_BRIDGE VKT_EXPORT void* VKT_API VKTCameraImageGetData (VKTCameraImage* cameraImage); + +/*! + @function VKTCameraImageGetDataSize + + @abstract Get the image data size of a camera image in bytes. + + @discussion Get the image data size of a camera image in bytes. + + @param cameraImage + Camera image. +*/ +VKT_BRIDGE VKT_EXPORT int32_t VKT_API VKTCameraImageGetDataSize (VKTCameraImage* cameraImage); + +/*! + @function VKTCameraImageGetFormat + + @abstract Get the format of a camera image. + + @discussion Get the format of a camera image. + + @param cameraImage + Camera image. +*/ +VKT_BRIDGE VKT_EXPORT VKTImageFormat VKT_API VKTCameraImageGetFormat (VKTCameraImage* cameraImage); + +/*! + @function VKTCameraImageGetWidth + + @abstract Get the width of a camera image. + + @discussion Get the width of a camera image. + + @param cameraImage + Camera image. +*/ +VKT_BRIDGE VKT_EXPORT int32_t VKT_API VKTCameraImageGetWidth (VKTCameraImage* cameraImage); + +/*! + @function VKTCameraImageGetHeight + + @abstract Get the height of a camera image. + + @discussion Get the height of a camera image. + + @param cameraImage + Camera image. +*/ +VKT_BRIDGE VKT_EXPORT int32_t VKT_API VKTCameraImageGetHeight (VKTCameraImage* cameraImage); + +/*! + @function VKTCameraImageGetRowStride + + @abstract Get the row stride of a camera image in bytes. + + @discussion Get the row stride of a camera image in bytes. + + @param cameraImage + Camera image. + + @returns Row stride in bytes. +*/ +VKT_BRIDGE VKT_EXPORT int32_t VKT_API VKTCameraImageGetRowStride (VKTCameraImage* cameraImage); + +/*! 
+ @function VKTCameraImageGetTimestamp + + @abstract Get the timestamp of a camera image. + + @discussion Get the timestamp of a camera image. + + @param cameraImage + Camera image. + + @returns Image timestamp in nanoseconds. +*/ +VKT_BRIDGE VKT_EXPORT int64_t VKT_API VKTCameraImageGetTimestamp (VKTCameraImage* cameraImage); + +/*! + @function VKTCameraImageGetVerticallyMirrored + + @abstract Whether the camera image is vertically mirrored. + + @discussion Whether the camera image is vertically mirrored. + + @param cameraImage + Camera image. +*/ +VKT_BRIDGE VKT_EXPORT bool VKT_API VKTCameraImageGetVerticallyMirrored (VKTCameraImage* cameraImage); + +/*! + @function VKTCameraImageGetPlaneCount + + @abstract Get the plane count of a camera image. + + @discussion Get the plane count of a camera image. + If the image uses an interleaved format or only has a single plane, this function returns zero. + + @param cameraImage + Camera image. + + @returns Number of planes in image. +*/ +VKT_BRIDGE VKT_EXPORT int32_t VKT_API VKTCameraImageGetPlaneCount (VKTCameraImage* cameraImage); + +/*! + @function VKTCameraImageGetPlaneData + + @abstract Get the plane data for a given plane of a camera image. + + @discussion Get the plane data for a given plane of a camera image. + + @param cameraImage + Camera image. + + @param planeIdx + Plane index. +*/ +VKT_BRIDGE VKT_EXPORT void* VKT_API VKTCameraImageGetPlaneData ( + VKTCameraImage* cameraImage, + int32_t planeIdx +); + +/*! + @function VKTCameraImageGetPlaneDataSize + + @abstract Get the plane data size of a given plane of a camera image. + + @discussion Get the plane data size of a given plane of a camera image. + + @param cameraImage + Camera image. + + @param planeIdx + Plane index. +*/ +VKT_BRIDGE VKT_EXPORT int32_t VKT_API VKTCameraImageGetPlaneDataSize ( + VKTCameraImage* cameraImage, + int32_t planeIdx +); + +/*! + @function VKTCameraImageGetPlaneWidth + + @abstract Get the width of a given plane of a camera image. + + @discussion Get the width of a given plane of a camera image. + + @param cameraImage + Camera image. + + @param planeIdx + Plane index. +*/ +VKT_BRIDGE VKT_EXPORT int32_t VKT_API VKTCameraImageGetPlaneWidth ( + VKTCameraImage* cameraImage, + int32_t planeIdx +); + +/*! + @function VKTCameraImageGetPlaneHeight + + @abstract Get the height of a given plane of a camera image. + + @discussion Get the height of a given plane of a camera image. + + @param cameraImage + Camera image. + + @param planeIdx + Plane index. +*/ +VKT_BRIDGE VKT_EXPORT int32_t VKT_API VKTCameraImageGetPlaneHeight ( + VKTCameraImage* cameraImage, + int32_t planeIdx +); + +/*! + @function VKTCameraImageGetPlanePixelStride + + @abstract Get the plane pixel stride for a given plane of a camera image. + + @discussion Get the plane pixel stride for a given plane of a camera image. + + @param cameraImage + Camera image. + + @param planeIdx + Plane index. + + @returns Plane pixel stride in bytes. +*/ +VKT_BRIDGE VKT_EXPORT int32_t VKT_API VKTCameraImageGetPlanePixelStride ( + VKTCameraImage* cameraImage, + int32_t planeIdx +); + +/*! + @function VKTCameraImageGetPlaneRowStride + + @abstract Get the plane row stride for a given plane of a camera image. + + @discussion Get the plane row stride for a given plane of a camera image. + + @param cameraImage + Camera image. + + @param planeIdx + Plane index. + + @returns Plane row stride in bytes. 
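+
+    Sketch of reading a planar image (the `image` variable is hypothetical and stands
+    for the `VKTCameraImage*` received inside a camera's `VKTSampleBufferHandler`):
+
+        if (VKTCameraImageGetFormat(image) == VKT_IMAGE_FORMAT_YCbCr420) {
+            int32_t planeCount = VKTCameraImageGetPlaneCount(image);
+            for (int32_t i = 0; i < planeCount; ++i) {
+                void* data = VKTCameraImageGetPlaneData(image, i);
+                int32_t size = VKTCameraImageGetPlaneDataSize(image, i);
+                int32_t rowStride = VKTCameraImageGetPlaneRowStride(image, i);
+                // consume `size` bytes of plane `i`, rows spaced by `rowStride` bytes
+            }
+        }
+        float intrinsics[9];
+        if (VKTCameraImageGetMetadata(image, VKT_METADATA_INTRINSIC_MATRIX, intrinsics, 9)) {
+            // the 3x3 camera intrinsic matrix is available in `intrinsics`
+        }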
+*/ +VKT_BRIDGE VKT_EXPORT int32_t VKT_API VKTCameraImageGetPlaneRowStride ( + VKTCameraImage* cameraImage, + int32_t planeIdx +); + +/*! + @function VKTCameraImageGetMetadata + + @abstract Get the metadata value for a given key in a camera image. + + @discussion Get the metadata value for a given key in a camera image. + + @param cameraImage + Camera image. + + @param key + Metadata key. + + @param value + Destination value array. + + @param count + Destination value array size. + + @returns Whether the metadata key was successfully looked up. +*/ +VKT_BRIDGE VKT_EXPORT bool VKT_API VKTCameraImageGetMetadata ( + VKTCameraImage* cameraImage, + VKTMetadata key, + float* value, + int32_t count +); + +/*! + @function VKTCameraImageConvertToRGBA8888 + + @abstract Convert a camera image to an RGBA8888 pixel buffer. + + @discussion Convert a camera image to an RGBA8888 pixel buffer. + + @param cameraImage + Camera image. + + @param orientation + Desired image orientation. + + @param mirror + Whether to vertically mirror the pixel buffer. + + @param tempBuffer + Temporary pixel buffer for intermediate conversions. Pass `NULL` to use short-time allocations. + + @param dstBuffer + Destination pixel buffer. This must be at least `width * height * 4` bytes large. + + @param dstWidth + Output pixel buffer width. + + @param dstHeight + Output pixel buffer height. +*/ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraImageConvertToRGBA8888 ( + VKTCameraImage* cameraImage, + VKTImageOrientation orientation, + bool mirror, + uint8_t* tempBuffer, + uint8_t* dstBuffer, + int32_t* dstWidth, + int32_t* dstHeight +); +#pragma endregion diff --git a/Plugins/iOS/VideoKit.framework/Headers/VKTDevice.h b/Plugins/iOS/VideoKit.framework/Headers/VKTDevice.h new file mode 100644 index 0000000..6e3eb5a --- /dev/null +++ b/Plugins/iOS/VideoKit.framework/Headers/VKTDevice.h @@ -0,0 +1,369 @@ +// +// VKTDevice.h +// VideoKit +// +// Created by Yusuf Olokoba on 5/15/2023. +// Copyright © 2023 NatML Inc. All rights reserved. +// + +#pragma once + +#include +#include +#include + +#pragma region --Enumerations-- +/*! + @enum VKTDeviceFlags + + @abstract Immutable properties of media devices. + + @constant VKT_DEVICE_FLAG_INTERNAL + Device is internal. + + @constant VKT_DEVICE_FLAG_EXTERNAL + Device is external. + + @constant VKT_DEVICE_FLAG_DEFAULT + Device is the default device for its media type. + + @constant VKT_MICROPHONE_FLAG_ECHO_CANCELLATION + Audio device supports echo cancellation. + + @constant VKT_CAMERA_FLAG_FRONT_FACING + Camera device is front-facing. + + @constant VKT_CAMERA_FLAG_FLASH + Camera device supports flash when capturing photos. + + @constant VKT_CAMERA_FLAG_TORCH + Camera device supports torch. + + @constant VKT_CAMERA_FLAG_DEPTH + Camera device supports depth streaming. + + @constant VKT_CAMERA_FLAG_EXPOSURE_CONTINUOUS + Camera device supports continuous auto exposure. + + @constant VKT_CAMERA_FLAG_EXPOSURE_LOCK + Camera device supports locked auto exposure. + + @constant VKT_CAMERA_FLAG_EXPOSURE_MANUAL + Camera device supports manual exposure. + + @constant VKT_CAMERA_FLAG_EXPOSURE_POINT + Camera device supports setting exposure point. + + @constant VKT_CAMERA_FLAG_FOCUS_CONTINUOUS + Camera device supports continuous auto exposure. + + @constant VKT_CAMERA_FLAG_LOCKED_FOCUS + Camera device supports locked auto focus. + + @constant VKT_CAMERA_FLAG_FOCUS_POINT + Camera device supports setting focus point. 
+ + @constant VKT_CAMERA_FLAG_WHITE_BALANCE_CONTINUOUS + Camera device supports continuous auto white balance. + + @constant VKT_CAMERA_FLAG_WHITE_BALANCE_LOCK + Camera device supports locked auto white balance. + + @constant VKT_CAMERA_FLAG_VIDEO_STABILIZATION + Camera device supports video stabilization. +*/ +enum VKTDeviceFlags { + VKT_DEVICE_FLAG_INTERNAL = 1 << 0, + VKT_DEVICE_FLAG_EXTERNAL = 1 << 1, + VKT_DEVICE_FLAG_DEFAULT = 1 << 3, + VKT_MICROPHONE_FLAG_ECHO_CANCELLATION = 1 << 2, + VKT_CAMERA_FLAG_FRONT_FACING = 1 << 6, + VKT_CAMERA_FLAG_FLASH = 1 << 7, + VKT_CAMERA_FLAG_TORCH = 1 << 8, + VKT_CAMERA_FLAG_DEPTH = 1 << 15, + VKT_CAMERA_FLAG_EXPOSURE_CONTINUOUS = 1 << 16, + VKT_CAMERA_FLAG_EXPOSURE_LOCK = 1 << 11, + VKT_CAMERA_FLAG_EXPOSURE_MANUAL = 1 << 14, + VKT_CAMERA_FLAG_EXPOSURE_POINT = 1 << 9, + VKT_CAMERA_FLAG_FOCUS_CONTINUOUS = 1 << 17, + VKT_CAMERA_FLAG_FOCUS_LOCK = 1 << 12, + VKT_CAMERA_FLAG_FOCUS_POINT = 1 << 10, + VKT_CAMERA_FLAG_WHITE_BALANCE_CONTINUOUS = 1 << 18, + VKT_CAMERA_FLAG_WHITE_BALANCE_LOCK = 1 << 13, + VKT_CAMERA_FLAG_VIDEO_STABILIZATION = 1 << 19, +}; +typedef enum VKTDeviceFlags VKTDeviceFlags; + +/*! + @enum VKTDevicePermissionType + + @abstract Media device permission type. + + @constant VKT_DEVICE_PERMISSION_TYPE_MICROPHONE + Request microphone permissions. + + @constant VKT_DEVICE_PERMISSION_TYPE_CAMERA + Request camera permissions. +*/ +enum VKTDevicePermissionType { + VKT_DEVICE_PERMISSION_TYPE_MICROPHONE = 1, + VKT_DEVICE_PERMISSION_TYPE_CAMERA = 2, +}; +typedef enum VKTDevicePermissionType VKTDevicePermissionType; + +/*! + @enum VKTDevicePermissionStatus + + @abstract Media device permission status. + + @constant VKT_DEVICE_PERMISSION_STATUS_UNKNOWN + User has not authorized or denied access to media device. + + @constant VKT_DEVICE_PERMISSION_STATUS_DENIED + User has denied access to media device. + + @constant VKT_DEVICE_PERMISSION_STATUS_AUTHORIZED + User has authorized access to media device. +*/ +enum VKTDevicePermissionStatus { + VKT_DEVICE_PERMISSION_STATUS_UNKNOWN = 0, + VKT_DEVICE_PERMISSION_STATUS_DENIED = 2, + VKT_DEVICE_PERMISSION_STATUS_AUTHORIZED = 3, +}; +typedef enum VKTDevicePermissionStatus VKTDevicePermissionStatus; +#pragma endregion + + +#pragma region --Types-- +/*! + @struct VKTDevice + + @abstract Media device. + + @discussion Media device. +*/ +struct VKTDevice; +typedef struct VKTDevice VKTDevice; + +/*! + @struct VKTSampleBuffer + + @abstract Sample buffer generated by a media device. + + @discussion Sample buffer generated by a media device. +*/ +struct VKTSampleBuffer; +typedef struct VKTSampleBuffer VKTSampleBuffer; +#pragma endregion + + +#pragma region --Delegates-- +/*! + @abstract Callback invoked with discovered media devices. + + @param context + User-provided context. + + @param devices + Discovered devices. These MUST be released when they are no longer needed. + + @param count + Discovered device count. +*/ +typedef void (*VKTDeviceDiscoveryHandler) (void* context, VKTDevice** devices, int32_t count); + +/*! + @abstract Callback invoked with new sample buffer from a media device. + + @param context + User-provided context. + + @param sampleBuffer + Sample buffer. +*/ +typedef void (*VKTSampleBufferHandler) (void* context, VKTSampleBuffer* sampleBuffer); + +/*! + @abstract Callback invoked when a camera device is disconnected. + + @param context + User-provided context. + + @param device + Media device that was disconnected. 
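+
+    Illustrative handler (the function name `OnDeviceDisconnected` and the `device`
+    passed to `VKTDeviceSetDisconnectHandler`, declared below, are hypothetical):
+
+        static void OnDeviceDisconnected (void* context, VKTDevice* device) {
+            // stop using `device`; it is no longer available
+        }
+
+        VKTDeviceSetDisconnectHandler(device, OnDeviceDisconnected, NULL);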
+*/ +typedef void (*VKTDeviceDisconnectHandler) (void* context, VKTDevice* device); + +/*! + @abstract Callback invoked with status of a media device permission request. + + @param context + User-provided context. + + @param status + Permission status. +*/ +typedef void (*VKTDevicePermissionHandler) (void* context, VKTDevicePermissionStatus status); +#pragma endregion + + +#pragma region --Client API-- +/*! + @function VKTDeviceRelease + + @abstract Release a media device. + + @discussion Release a media device. + + @param device + Media device. +*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTDeviceRelease (VKTDevice* device); + +/*! + @function VKTDeviceGetUniqueID + + @abstract Get the media device unique ID. + + @discussion Get the media device unique ID. + + @param device + Media device. + + @param destination + Destination UTF-8 string. +*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTDeviceGetUniqueID ( + VKTDevice* device, + char* destination +); + +/*! + @function VKTDeviceGetName + + @abstract Media device name. + + @discussion Media device name. + + @param device + Media device. + + @param destination + Destination UTF-8 string. +*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTDeviceGetName ( + VKTDevice* device, + char* destination +); + +/*! + @function VKTDeviceGetFlags + + @abstract Get the media device flags. + + @discussion Get the media device flags. + + @param device + Media device. + + @returns Device flags. +*/ +VKT_BRIDGE VKT_EXPORT VKTDeviceFlags VKT_API VKTDeviceGetFlags (VKTDevice* device); + +/*! + @function VKTDeviceIsRunning + + @abstract Is the device running? + + @discussion Is the device running? + + @param device + Media device. + + @returns True if media device is running. +*/ +VKT_BRIDGE VKT_EXPORT bool VKT_API VKTDeviceIsRunning (VKTDevice* device); + +/*! + @function VKTDeviceStartRunning + + @abstract Start running an media device. + + @discussion Start running an media device. + + @param device + Media device. + + @param sampleBufferHandler + Sample buffer delegate to receive sample buffers as the device produces them. + + @param context + User-provided context to be passed to the sample buffer delegate. Can be `NULL`. +*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTDeviceStartRunning ( + VKTDevice* device, + VKTSampleBufferHandler sampleBufferHandler, + void* context +); + +/*! + @function VKTDeviceStopRunning + + @abstract Stop running device. + + @discussion Stop running device. + + @param device + Media device. +*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTDeviceStopRunning (VKTDevice* device); + +/*! + @function VKTDeviceSetDisconnectHandler + + @abstract Set the device disconnect handler. + + @discussion Set the device disconnect handler. + This provided function pointer is invoked when the device is disconnected. + + @param device + Media device. + + @param disconnectHandler + Device disconnect handler. Can be `NULL`. + + @param context + User-provided context. Can be `NULL`. +*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTDeviceSetDisconnectHandler ( + VKTDevice* device, + VKTDeviceDisconnectHandler disconnectHandler, + void* context +); + +/*! + @function VKTDeviceCheckPermissions + + @abstract Check permissions for a given media device type. + + @discussion Check permissions for a given media device type. + + @param type + Permission type. + + @param request + Whether to request the permissions if the user has not been asked. + + @param handler + Permission delegate to receive result of permission request. 
+ + @param context + User-provided context to be passed to the permission delegate. Can be `NULL`. +*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTDeviceCheckPermissions ( + VKTDevicePermissionType type, + bool request, + VKTDevicePermissionHandler handler, + void* context +); +#pragma endregion diff --git a/Plugins/iOS/VideoKit.framework/Headers/VKTMicrophone.h b/Plugins/iOS/VideoKit.framework/Headers/VKTMicrophone.h new file mode 100644 index 0000000..d755a01 --- /dev/null +++ b/Plugins/iOS/VideoKit.framework/Headers/VKTMicrophone.h @@ -0,0 +1,213 @@ +// +// VKTMicrophone.h +// VideoKit +// +// Created by Yusuf Olokoba on 5/15/2023. +// Copyright © 2023 NatML Inc. All rights reserved. +// + +#pragma once + +#include + +#pragma region --Types-- +/*! + @typedef VKTMicrophone + + @abstract Audio input device. + + @discussion Audio input device. +*/ +typedef VKTDevice VKTMicrophone; + +/*! + @typedef VKTAudioBuffer + + @abstract Audio buffer. + + @discussion Audio buffer +*/ +typedef VKTSampleBuffer VKTAudioBuffer; +#pragma endregion + + +#pragma region --AudioDevice-- +/*! + @function VKTDiscoverMicrophones + + @abstract Discover available audio input devices. + + @discussion Discover available audio input devices. + + @param handler + Device handler. MUST NOT be `NULL`. + + @param context + Handler context. + + @returns Status. +*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTDiscoverMicrophones ( + VKTDeviceDiscoveryHandler handler, + void* context +); + +/*! + @function VKTMicrophoneGetEchoCancellation + + @abstract Get the device echo cancellation mode. + + @discussion Get the device echo cancellation mode. + + @param microphone + Microphone. + + @returns True if the device performs adaptive echo cancellation. + */ +VKT_BRIDGE VKT_EXPORT bool VKT_API VKTMicrophoneGetEchoCancellation (VKTMicrophone* microphone); + +/*! + @function VKTMicrophoneSetEchoCancellation + + @abstract Enable or disable echo cancellation on the device. + + @discussion If the device does not support echo cancellation, this will be a nop. + + @param microphone + Microphone. + + @param echoCancellation + Echo cancellation. + */ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTMicrophoneSetEchoCancellation ( + VKTMicrophone* microphone, + bool echoCancellation +); + +/*! + @function VKTMicrophoneGetSampleRate + + @abstract Audio device sample rate. + + @discussion Audio device sample rate. + + @param microphone + Microphone. + + @returns Current sample rate. + */ +VKT_BRIDGE VKT_EXPORT int32_t VKT_API VKTMicrophoneGetSampleRate (VKTMicrophone* microphone); + +/*! + @function VKTMicrophoneSetSampleRate + + @abstract Set the audio device sample rate. + + @discussion Set the audio device sample rate. + + @param microphone + Microphone. + + @param sampleRate + Sample rate to set. + */ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTMicrophoneSetSampleRate ( + VKTMicrophone* microphone, + int32_t sampleRate +); + +/*! + @function VKTMicrophoneGetChannelCount + + @abstract Audio device channel count. + + @discussion Audio device channel count. + + @param microphone + Microphone. + + @returns Current channel count. + */ +VKT_BRIDGE VKT_EXPORT int32_t VKT_API VKTMicrophoneGetChannelCount (VKTMicrophone* microphone); + +/*! + @function VKTMicrophoneSetChannelCount + + @abstract Set the audio device channel count. + + @discussion Set the audio device channel count. + + @param microphone + Microphone. + + @param channelCount + Channel count to set. 
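+
+    Configuration sketch (assumes `microphone` came from `VKTDiscoverMicrophones`;
+    48 kHz mono is an arbitrary example format, and `OnAudioBuffer` is a hypothetical
+    handler name):
+
+        static void OnAudioBuffer (void* context, VKTSampleBuffer* sampleBuffer) {
+            VKTAudioBuffer* audioBuffer = (VKTAudioBuffer*)sampleBuffer;
+            const float* samples = VKTAudioBufferGetData(audioBuffer);
+            int32_t sampleCount = VKTAudioBufferGetSampleCount(audioBuffer);
+            // consume `sampleCount` interleaved samples here
+        }
+
+        VKTMicrophoneSetSampleRate(microphone, 48000);
+        VKTMicrophoneSetChannelCount(microphone, 1);
+        VKTMicrophoneSetEchoCancellation(microphone, true);
+        VKTDeviceStartRunning(microphone, OnAudioBuffer, NULL);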
+ */ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTMicrophoneSetChannelCount ( + VKTMicrophone* microphone, + int32_t channelCount +); +#pragma endregion + + +#pragma region --AudioBuffer-- +/*! + @function VKTAudioBufferGetData + + @abstract Get the audio data of an audio buffer. + + @discussion Get the audio data of an audio buffer. + + @param audioBuffer + Audio buffer. +*/ +VKT_BRIDGE VKT_EXPORT float* VKT_API VKTAudioBufferGetData (VKTAudioBuffer* audioBuffer); + +/*! + @function VKTAudioBufferGetSampleCount + + @abstract Get the total sample count of an audio buffer. + + @discussion Get the total sample count of an audio buffer. + + @param audioBuffer + Audio buffer. +*/ +VKT_BRIDGE VKT_EXPORT int32_t VKT_API VKTAudioBufferGetSampleCount (VKTAudioBuffer* audioBuffer); + +/*! + @function VKTAudioBufferGetSampleRate + + @abstract Get the sample rate of an audio buffer. + + @discussion Get the sample rate of an audio buffer. + + @param audioBuffer + Audio buffer. +*/ +VKT_BRIDGE VKT_EXPORT int32_t VKT_API VKTAudioBufferGetSampleRate (VKTAudioBuffer* audioBuffer); + +/*! + @function VKTAudioBufferGetChannelCount + + @abstract Get the channel count of an audio buffer. + + @discussion Get the channel count of an audio buffer. + + @param audioBuffer + Audio buffer. +*/ +VKT_BRIDGE VKT_EXPORT int32_t VKT_API VKTAudioBufferGetChannelCount (VKTAudioBuffer* audioBuffer); + +/*! + @function VKTAudioBufferGetTimestamp + + @abstract Get the timestamp of an audio buffer. + + @discussion Get the timestamp of an audio buffer. + + @param audioBuffer + Audio buffer. +*/ +VKT_BRIDGE VKT_EXPORT int64_t VKT_API VKTAudioBufferGetTimestamp (VKTAudioBuffer* audioBuffer); +#pragma endregion diff --git a/Plugins/iOS/VideoKit.framework/Headers/VKTRecorder.h b/Plugins/iOS/VideoKit.framework/Headers/VKTRecorder.h new file mode 100644 index 0000000..04d74ee --- /dev/null +++ b/Plugins/iOS/VideoKit.framework/Headers/VKTRecorder.h @@ -0,0 +1,389 @@ +// +// VKTRecorder.h +// VideoKit +// +// Created by Yusuf Olokoba on 5/15/2023. +// Copyright © 2023 NatML Inc. All rights reserved. +// + +#pragma once + +#include +#include + +#pragma region --Types-- +/*! + @struct VKTRecorder + + @abstract Media recorder. + + @discussion Media recorder. +*/ +struct VKTRecorder; +typedef struct VKTRecorder VKTRecorder; + +/*! + @abstract Callback invoked with path to recorded media file. + + @param context + User context provided to recorder. + + @param path + Path to record file. If recording fails for any reason, this path will be `NULL`. +*/ +typedef void (*VKTRecordingHandler) (void* context, const char* path); +#pragma endregion + + +#pragma region --IMediaRecorder-- +/*! + @function VKTRecorderGetFrameSize + + @abstract Get the recorder's frame size. + + @discussion Get the recorder's frame size. + + @param recorder + Media recorder. + + @param outWidth + Output frame width. + + @param outHeight + Output frame height. + */ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTRecorderGetFrameSize ( + VKTRecorder* recorder, + int32_t* outWidth, + int32_t* outHeight +); + +/*! + @function VKTRecorderCommitFrame + + @abstract Commit a video frame to the recording. + + @discussion Commit a video frame to the recording. + + @param recorder + Media recorder. + + @param pixelBuffer + Pixel buffer containing a single video frame. + The pixel buffer MUST be laid out in RGBA8888 order (32 bits per pixel). + + @param timestamp + Frame timestamp. The spacing between consecutive timestamps determines the + effective framerate of some recorders. 
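+
+    Commit-loop sketch (assumes `recorder` was created with one of the constructors
+    below, `rgbaFrame` is a `width * height * 4` byte RGBA8888 buffer, `frameCount` is
+    caller-defined, and timestamps are expressed in nanoseconds, matching the camera
+    image timestamps documented in this header):
+
+        const int64_t frameDuration = 1000000000LL / 30;   // 30 FPS spacing
+        for (int32_t i = 0; i < frameCount; ++i) {
+            int64_t timestamp = i * frameDuration;
+            VKTRecorderCommitFrame(recorder, rgbaFrame, timestamp);
+        }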
+ */ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTRecorderCommitFrame ( + VKTRecorder* recorder, + const uint8_t* pixelBuffer, + int64_t timestamp +); + +/*! + @function VKTRecorderCommitSamples + + @abstract Commit an audio frame to the recording. + + @discussion Commit an audio frame to the recording. + + @param recorder + Media recorder. + + @param sampleBuffer + Sample buffer containing a single audio frame. The sample buffer MUST be in 32-bit PCM + format, interleaved by channel for channel counts greater than 1. + + @param sampleCount + Total number of samples in the sample buffer. This should account for multiple channels. + + @param timestamp + Frame timestamp. The spacing between consecutive timestamps determines the + effective framerate of some recorders. +*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTRecorderCommitSamples ( + VKTRecorder* recorder, + const float* sampleBuffer, + int32_t sampleCount, + int64_t timestamp +); + +/*! + @function VKTRecorderFinishWriting + + @abstract Finish writing and invoke the completion handler. + + @discussion Finish writing and invoke the completion handler. The recorder is automatically + released, along with any resources it owns. The recorder MUST NOT be used once this function + has been invoked. + + The completion handler will be invoked soon after this function is called. If recording fails for any reason, + the completion handler will receive `NULL` for the recording path. + + @param recorder + Media recorder. + + @param handler + Recording completion handler invoked once recording is completed. + + @param context + Context passed to completion handler. Can be `NULL`. +*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTRecorderFinishWriting ( + VKTRecorder* recorder, + VKTRecordingHandler handler, + void* context +); +#pragma endregion + + +#pragma region --Constructors-- +/*! + @function VKTRecorderCreateMP4 + + @abstract Create an MP4 recorder. + + @discussion Create an MP4 recorder that records with the H.264 AVC codec. + + @param path + Recording path. This path must be writable on the local file system. + + @param width + Video width. + + @param height + Video height. + + @param frameRate + Video frame rate. + + @param sampleRate + Audio sample rate. Pass 0 if recording without audio. + + @param channelCount + Audio channel count. Pass 0 if recording without audio. + + @param videoBitRate + Video bit rate in bits per second. + + @param keyframeInterval + Video keyframe interval in seconds. + + @param audioBitRate + Audio bit rate in bits per second. Ignored if no audio format is provided. + + @param recorder + Output media recorder. + + @returns Recorder creation status. +*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTRecorderCreateMP4 ( + const char* path, + int32_t width, + int32_t height, + float frameRate, + int32_t sampleRate, + int32_t channelCount, + int32_t videoBitRate, + int32_t keyframeInterval, + int32_t audioBitRate, + VKTRecorder** recorder +); + +/*! + @function VKTRecorderCreateHEVC + + @abstract Create an HEVC recorder. + + @discussion Create an MP4 recorder that records with the H.265 HEVC codec. + + @param path + Recording path. This path must be writable on the local file system. + + @param width + Video width. + + @param height + Video height. + + @param frameRate + Video frame rate. + + @param sampleRate + Audio sample rate. Pass 0 if recording without audio. + + @param channelCount + Audio channel count. Pass 0 if recording without audio. + + @param videoBitRate + Video bit rate in bits per second. 
+ + @param keyframeInterval + Video keyframe interval in seconds. + + @param audioBitRate + Audio bit rate in bits per second. Ignored if no audio format is provided. + + @param recorder + Output media recorder. + + @returns Recorder creation status. + */ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTRecorderCreateHEVC ( + const char* path, + int32_t width, + int32_t height, + float frameRate, + int32_t sampleRate, + int32_t channelCount, + int32_t videoBitRate, + int32_t keyframeInterval, + int32_t audioBitRate, + VKTRecorder** recorder +); + +/*! + @function VKTRecorderCreateWEBM + + @abstract Create a WEBM video recorder. + + @discussion Create a WEBM video recorder. + + @param path + Recording path. This path must be writable on the local file system. + + @param width + Video width. + + @param height + Vide. height. + + @param frameRate + Video frame rate. + + @param sampleRate + Audio sample rate. Pass 0 if recording without audio. + + @param channelCount + Audio channel count. Pass 0 if recording without audio. + + @param videoBitRate + Video bit rate in bits per second. + + @param keyframeInterval + Video keyframe interval in seconds. + + @param audioBitRate + Audio bit rate in bits per second. Ignored if no audio format is provided. + + @param recorder + Output media recorder. + + @returns Recorder creation status. +*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTRecorderCreateWEBM ( + const char* path, + int32_t width, + int32_t height, + float frameRate, + int32_t sampleRate, + int32_t channelCount, + int32_t videoBitRate, + int32_t keyframeInterval, + int32_t audioBitRate, + VKTRecorder** recorder +); + +/*! + @function VKTRecorderCreateGIF + + @abstract Create a GIF recorder. + + @discussion Create an animated GIF recorder. + The generated GIF image will loop forever. + + @param path + Recording path. This path must be writable on the local file system. + + @param width + Image width. + + @param height + Image height. + + @param delay + Per-frame delay in seconds. + + @param recorder + Output media recorder. + + @returns Recorder creation status. +*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTRecorderCreateGIF ( + const char* path, + int32_t width, + int32_t height, + float delay, + VKTRecorder** recorder +); + +/*! + @function VKTRecorderCreateWAV + + @abstract Create a WAV audio recorder. + + @discussion Create a WAV audio recorder. + + @param path + Recording path. This path must be writable on the local file system. + + @param sampleRate + Audio sample rate. + + @param channelCount + Audio channel count. + + @param recorder + Output media recorder. + + @returns Recorder creation status. +*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTRecorderCreateWAV ( + const char* path, + int32_t sampleRate, + int32_t channelCount, + VKTRecorder** recorder +); + +/*! + @function VKTRecorderCreateJPEG + + @abstract Create a JPEG image sequence recorder. + + @discussion Create a JPEG image sequence recorder. + The recorder returns a path separator-delimited list of image frame paths. + + @param width + Image width. + + @param height + Image height. + + @param compressionQuality + Image compression quality in range [0, 1]. + + @param recorder + Output media recorder. + + @returns Recorder creation status. 
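+
+    End-to-end sketch that applies to any of the constructors in this region, shown
+    here with `VKTRecorderCreateMP4` (the path, dimensions, and bit rates are arbitrary
+    example values, and `rgbaFrame` / `OnRecordingCompleted` are hypothetical):
+
+        static void OnRecordingCompleted (void* context, const char* path) {
+            // `path` is NULL if recording failed
+        }
+
+        VKTRecorder* recorder = NULL;
+        VKTStatus status = VKTRecorderCreateMP4(
+            "/tmp/recording.mp4", 1280, 720, 30.f,
+            0, 0,               // no audio
+            10000000, 2, 0,     // video bit rate, keyframe interval, audio bit rate
+            &recorder
+        );
+        if (status == VKT_OK) {
+            VKTRecorderCommitFrame(recorder, rgbaFrame, 0);
+            VKTRecorderFinishWriting(recorder, OnRecordingCompleted, NULL);
+        }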
+*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTRecorderCreateJPEG ( + const char* path, + int32_t width, + int32_t height, + float compressionQuality, + VKTRecorder** recorder +); +#pragma endregion \ No newline at end of file diff --git a/Plugins/iOS/VideoKit.framework/Headers/VKTSession.h b/Plugins/iOS/VideoKit.framework/Headers/VKTSession.h new file mode 100644 index 0000000..58e4842 --- /dev/null +++ b/Plugins/iOS/VideoKit.framework/Headers/VKTSession.h @@ -0,0 +1,92 @@ +// +// VKTSession.h +// VideoKit +// +// Created by Yusuf Olokoba on 5/15/2023. +// Copyright © 2023 NatML Inc. All rights reserved. +// + +#pragma once + +#include + +#pragma region --Enumerations-- +/*! + @enum VKTStatus + + @abstract VideoKit status codes. + + @constant VKT_OK + Successful operation. + + @constant VKT_ERROR_INVALID_ARGUMENT + Provided argument is invalid. + + @constant VKT_ERROR_INVALID_OPERATION + Operation is invalid in current state. + + @constant VKT_ERROR_NOT_IMPLEMENTED + Operation has not been implemented. + + @constant VKT_ERROR_INVALID_SESSION + VideoKit session has not been set or is invalid. + + @constant VKT_ERROR_INVALID_PLAN + Current VideoKit plan does not allow the operation. + + @constant VKT_WARNING_LIMITED_PLAN + Current VideoKit plan only allows for limited functionality. + */ +enum VKTStatus { + VKT_OK = 0, + VKT_ERROR_INVALID_ARGUMENT = 1, + VKT_ERROR_INVALID_OPERATION = 2, + VKT_ERROR_NOT_IMPLEMENTED = 3, + VKT_ERROR_INVALID_SESSION = 101, + VKT_ERROR_INVALID_PLAN = 104, + VKT_WARNING_LIMITED_PLAN = 105, +}; +typedef enum VKTStatus VKTStatus; +#pragma endregion + + +#pragma region --Client API-- +/*! + @function VKTGetBundleIdentifier + + @abstract Get the application bundle ID for generating a session token. + + @discussion Get the application bundle ID for generating a session token. + + @param bundle + Destination bundle ID string. + + @returns Operation status. +*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTGetBundleIdentifier (char * bundle); + +/*! + @function VKTGetSessionStatus + + @abstract Get the VideoKit session status. + + @discussion Get the VideoKit session status. + + @returns Session status. +*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTGetSessionStatus (void); + +/*! + @function VKTSetSessionToken + + @abstract Set the VideoKit session token. + + @discussion Set the VideoKit session token. + + @param token + VideoKit session token. + + @returns Session status. +*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTSetSessionToken (const char* token); +#pragma endregion diff --git a/Plugins/iOS/VideoKit.framework/Headers/VKTTypes.h b/Plugins/iOS/VideoKit.framework/Headers/VKTTypes.h new file mode 100644 index 0000000..1055b38 --- /dev/null +++ b/Plugins/iOS/VideoKit.framework/Headers/VKTTypes.h @@ -0,0 +1,29 @@ +// +// VKTTypes.h +// VideoKit +// +// Created by Yusuf Olokoba on 5/15/2023. +// Copyright © 2023 NatML Inc. All rights reserved. 
+// + +#pragma once + +#ifdef __cplusplus + #define VKT_BRIDGE extern "C" +#else + #define VKT_BRIDGE extern +#endif + +#ifdef _WIN64 + #define VKT_EXPORT __declspec(dllexport) +#else + #define VKT_EXPORT +#endif + +#ifdef __EMSCRIPTEN__ + #define VKT_API EMSCRIPTEN_KEEPALIVE +#elif defined(_WIN64) + #define VKT_API APIENTRY +#else + #define VKT_API +#endif \ No newline at end of file diff --git a/Plugins/iOS/VideoKit.framework/Headers/VideoKit.h b/Plugins/iOS/VideoKit.framework/Headers/VideoKit.h new file mode 100644 index 0000000..87a72d3 --- /dev/null +++ b/Plugins/iOS/VideoKit.framework/Headers/VideoKit.h @@ -0,0 +1,17 @@ +// +// VideoKit.h +// VideoKit +// +// Created by Yusuf Olokoba on 5/15/2023. +// Copyright © 2023 NatML Inc. All rights reserved. +// + +#pragma once + +#include +#include +#include +#include +#include +#include +#include \ No newline at end of file diff --git a/Plugins/iOS/VideoKit.framework/Info.plist b/Plugins/iOS/VideoKit.framework/Info.plist new file mode 100644 index 0000000..b5d707e Binary files /dev/null and b/Plugins/iOS/VideoKit.framework/Info.plist differ diff --git a/Plugins/iOS/VideoKit.framework/Modules/module.modulemap b/Plugins/iOS/VideoKit.framework/Modules/module.modulemap new file mode 100644 index 0000000..8ebd5af --- /dev/null +++ b/Plugins/iOS/VideoKit.framework/Modules/module.modulemap @@ -0,0 +1,6 @@ +framework module VideoKit { + umbrella header "VideoKit.h" + export * + + module * { export * } +} diff --git a/Plugins/iOS/VideoKit.framework/PrivateHeaders/VKTAudioDevice.h b/Plugins/iOS/VideoKit.framework/PrivateHeaders/VKTAudioDevice.h new file mode 100644 index 0000000..d2166a1 --- /dev/null +++ b/Plugins/iOS/VideoKit.framework/PrivateHeaders/VKTAudioDevice.h @@ -0,0 +1,19 @@ +// +// VKTAudioDevice.h +// VideoKit +// +// Created by Yusuf Olokoba on 6/2/2023. +// Copyright © 2023 NatML Inc. All rights reserved. +// + +#import "VKTMediaDevice.h" + +@interface VKTAudioDevice : NSObject +// Introspection +- (nonnull instancetype) initWithPort:(nonnull AVAudioSessionPortDescription*) port; +@property (readonly, nonnull) AVAudioSessionPortDescription* port; +// Settings +@property bool echoCancellation; +@property int sampleRate; +@property int channelCount; +@end diff --git a/Plugins/iOS/VideoKit.framework/PrivateHeaders/VKTAudioDeviceBuffer.h b/Plugins/iOS/VideoKit.framework/PrivateHeaders/VKTAudioDeviceBuffer.h new file mode 100644 index 0000000..9ed107a --- /dev/null +++ b/Plugins/iOS/VideoKit.framework/PrivateHeaders/VKTAudioDeviceBuffer.h @@ -0,0 +1,15 @@ +// +// VKTAudioDeviceBuffer.h +// VideoKit +// +// Created by Yusuf Olokoba on 10/30/2021. +// Copyright © 2023 NatML Inc. All rights reserved. +// + +@import AVFoundation; + +@interface VKTAudioDeviceBuffer : NSObject +@property (readonly, nonnull) AVAudioPCMBuffer* buffer; +@property (readonly) UInt64 timestamp; +- (nonnull instancetype) initWithBuffer:(AVAudioPCMBuffer* _Nonnull) buffer andTimestamp:(UInt64) timestamp; +@end diff --git a/Plugins/iOS/VideoKit.framework/PrivateHeaders/VKTCameraDevice.h b/Plugins/iOS/VideoKit.framework/PrivateHeaders/VKTCameraDevice.h new file mode 100644 index 0000000..cb8b880 --- /dev/null +++ b/Plugins/iOS/VideoKit.framework/PrivateHeaders/VKTCameraDevice.h @@ -0,0 +1,23 @@ +// +// VKTCameraDevice.h +// VideoKit +// +// Created by Yusuf Olokoba on 6/2/2023. +// Copyright © 2023 NatML Inc. All rights reserved. 
+// + +#import "VKTMediaDevice.h" + +@interface VKTCameraDevice : NSObject +// Introspection +- (nonnull instancetype) initWithDevice:(nonnull AVCaptureDevice*) device; +@property (readonly, nonnull) AVCaptureDevice* device; +@property (readonly) CGSize fieldOfView; +// Settings +@property CGSize previewResolution; +@property CGSize photoResolution; +@property int frameRate; +@property VKTFlashMode flashMode; +@property VKTVideoStabilizationMode videoStabilizationMode; +- (void) capturePhoto:(nonnull VKTSampleBufferBlock) photoBufferBlock; +@end diff --git a/Plugins/iOS/VideoKit.framework/PrivateHeaders/VKTCameraDeviceImage.h b/Plugins/iOS/VideoKit.framework/PrivateHeaders/VKTCameraDeviceImage.h new file mode 100644 index 0000000..110d5f1 --- /dev/null +++ b/Plugins/iOS/VideoKit.framework/PrivateHeaders/VKTCameraDeviceImage.h @@ -0,0 +1,20 @@ +// +// VKTCameraDeviceImage.h +// VideoKit +// +// Created by Yusuf Olokoba on 10/30/2021. +// Copyright © 2023 NatML Inc. All rights reserved. +// + +@import AVFoundation; + +@interface VKTCameraDeviceImage : NSObject +@property (readonly, nonnull) CVPixelBufferRef pixelBuffer; +@property (readonly) UInt64 timestamp; +@property (readonly) bool verticallyMirrored; +@property (readonly) bool hasIntrinsicMatrix; +@property (readonly) matrix_float3x3 intrinsicMatrix; +@property (readonly, nonnull) NSDictionary* metadata; +- (nonnull instancetype) initWithSampleBuffer:(nonnull CMSampleBufferRef) sampleBuffer andMirror:(bool) mirror; +- (nonnull instancetype) initWithPhoto:(nonnull AVCapturePhoto*) photo andMirror:(bool) mirror; +@end diff --git a/Plugins/iOS/VideoKit.framework/PrivateHeaders/VKTMediaDevice.h b/Plugins/iOS/VideoKit.framework/PrivateHeaders/VKTMediaDevice.h new file mode 100644 index 0000000..6b65152 --- /dev/null +++ b/Plugins/iOS/VideoKit.framework/PrivateHeaders/VKTMediaDevice.h @@ -0,0 +1,26 @@ +// +// VKTMediaDevice.h +// VideoKit +// +// Created by Yusuf Olokoba on 1/3/2020. +// Copyright © 2023 NatML Inc. All rights reserved. +// + +@import AVFoundation; +#include + +@protocol VKTMediaDevice; + +typedef void (^VKTSampleBufferBlock) (id _Nonnull sampleBuffer); +typedef void (^VKTDeviceDisconnectBlock) (id _Nonnull device); + +@protocol VKTMediaDevice +@required +@property (readonly, nonnull) NSString* uniqueID; +@property (readonly, nonnull) NSString* name; +@property (readonly) VKTDeviceFlags flags; +@property (readonly) bool running; +@property (nullable) VKTDeviceDisconnectBlock disconnectHandler; +- (void) startRunning:(nonnull VKTSampleBufferBlock) sampleBufferHandler; +- (void) stopRunning; +@end diff --git a/Plugins/iOS/VideoKit.framework/PrivateHeaders/VKTMediaRecorder.h b/Plugins/iOS/VideoKit.framework/PrivateHeaders/VKTMediaRecorder.h new file mode 100644 index 0000000..36162d1 --- /dev/null +++ b/Plugins/iOS/VideoKit.framework/PrivateHeaders/VKTMediaRecorder.h @@ -0,0 +1,18 @@ +// +// VKTNativeRecorder.h +// VideoKit +// +// Created by Yusuf Olokoba on 6/26/2018. +// Copyright © 2023 NatML Inc. All rights reserved. 
+// + +@import AVFoundation; + +typedef void (^VKTRecordingCompletionBlock) (NSString* _Nullable url); + +@protocol VKTMediaRecorder +@property (readonly) CMVideoDimensions frameSize; +- (void) commitFrame:(nonnull CVPixelBufferRef) pixelBuffer timestamp:(int64_t) timestamp; +- (void) commitSamples:(nonnull const float*) sampleBuffer sampleCount:(int) sampleCount timestamp:(int64_t) timestamp; +- (void) finishWritingWithCompletionHandler:(nonnull VKTRecordingCompletionBlock) completionHandler; +@end diff --git a/Plugins/iOS/VideoKit.framework/VideoKit b/Plugins/iOS/VideoKit.framework/VideoKit new file mode 100755 index 0000000..14a720f Binary files /dev/null and b/Plugins/iOS/VideoKit.framework/VideoKit differ diff --git a/Plugins/macOS.meta b/Plugins/macOS.meta new file mode 100644 index 0000000..a8ad795 --- /dev/null +++ b/Plugins/macOS.meta @@ -0,0 +1,8 @@ +fileFormatVersion: 2 +guid: e8f71437a9df24b8a8493931dbb94944 +folderAsset: yes +DefaultImporter: + externalObjects: {} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Plugins/macOS/VideoKit.bundle.meta b/Plugins/macOS/VideoKit.bundle.meta new file mode 100644 index 0000000..1bdf038 --- /dev/null +++ b/Plugins/macOS/VideoKit.bundle.meta @@ -0,0 +1,81 @@ +fileFormatVersion: 2 +guid: 2ef57c6f1012d41e6a4c5ba0a4487b67 +PluginImporter: + externalObjects: {} + serializedVersion: 2 + iconMap: {} + executionOrder: {} + defineConstraints: [] + isPreloaded: 0 + isOverridable: 1 + isExplicitlyReferenced: 0 + validateReferences: 1 + platformData: + - first: + : Any + second: + enabled: 0 + settings: + Exclude Android: 1 + Exclude Editor: 0 + Exclude Linux64: 1 + Exclude OSXUniversal: 0 + Exclude WebGL: 1 + Exclude Win: 1 + Exclude Win64: 1 + Exclude iOS: 1 + - first: + Android: Android + second: + enabled: 0 + settings: + CPU: ARMv7 + - first: + Any: + second: + enabled: 0 + settings: {} + - first: + Editor: Editor + second: + enabled: 1 + settings: + CPU: AnyCPU + DefaultValueInitialized: true + OS: OSX + - first: + Standalone: Linux64 + second: + enabled: 0 + settings: + CPU: AnyCPU + - first: + Standalone: OSXUniversal + second: + enabled: 1 + settings: + CPU: AnyCPU + - first: + Standalone: Win + second: + enabled: 0 + settings: + CPU: x86 + - first: + Standalone: Win64 + second: + enabled: 0 + settings: + CPU: x86_64 + - first: + iPhone: iOS + second: + enabled: 0 + settings: + AddToEmbeddedBinaries: false + CPU: AnyCPU + CompileFlags: + FrameworkDependencies: + userData: + assetBundleName: + assetBundleVariant: diff --git a/Plugins/macOS/VideoKit.bundle/Contents/Headers/VKTCamera.h b/Plugins/macOS/VideoKit.bundle/Contents/Headers/VKTCamera.h new file mode 100644 index 0000000..b3d65f2 --- /dev/null +++ b/Plugins/macOS/VideoKit.bundle/Contents/Headers/VKTCamera.h @@ -0,0 +1,1177 @@ +// +// VKTCamera.h +// VideoKit +// +// Created by Yusuf Olokoba on 5/15/2023. +// Copyright © 2023 NatML Inc. All rights reserved. +// + +#pragma once + +#include + +#pragma region --Enumerations-- +/*! + @enum VKTExposureMode + + @abstract Camera device exposure mode. + + @constant VKT_EXPOSURE_MODE_CONTINUOUS + Continuous auto exposure. + + @constant VKT_EXPOSURE_MODE_LOCKED + Locked exposure. Exposure settings will be fixed to their current values. + Requires `VKT_CAMERA_FLAG_LOCKED_EXPOSURE` device flag. + + @constant VKT_EXPOSURE_MODE_MANUAL + Manual exposure. User will set exposure duration and sensitivity. + Requires `VKT_CAMERA_FLAG_MANUAL_EXPOSURE` device flag. 
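+
+    Manual exposure sketch (assumes `camera` advertises the `VKT_CAMERA_FLAG_EXPOSURE_MANUAL`
+    device flag; the midpoint values are illustrative only):
+
+        float minDuration, maxDuration, minISO, maxISO;
+        VKTCameraGetExposureDurationRange(camera, &minDuration, &maxDuration);
+        VKTCameraGetISORange(camera, &minISO, &maxISO);
+        float duration = 0.5f * (minDuration + maxDuration);
+        float ISO = 0.5f * (minISO + maxISO);
+        VKTCameraSetExposureDuration(camera, duration, ISO);   // switches the camera to manual exposure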
+*/ +enum VKTExposureMode { + VKT_EXPOSURE_MODE_CONTINUOUS = 0, + VKT_EXPOSURE_MODE_LOCKED = 1, + VKT_EXPOSURE_MODE_MANUAL = 2 +}; +typedef enum VKTExposureMode VKTExposureMode; + +/*! + @enum VKTFlashMode + + @abstract Camera device photo flash modes. + + @constant VKT_FLASH_MODE_OFF + The flash will never be fired. + + @constant VKT_FLASH_MODE_ON + The flash will always be fired. + + @constant VKT_FLASH_MODE_AUTO + The sensor will determine whether to fire the flash. +*/ +enum VKTFlashMode { + VKT_FLASH_MODE_OFF = 0, + VKT_FLASH_MODE_ON = 1, + VKT_FLASH_MODE_AUTO = 2 +}; +typedef enum VKTFlashMode VKTFlashMode; + +/*! + @enum VKTFocusMode + + @abstract Camera device focus mode. + + @constant VKT_FOCUS_MODE_CONTINUOUS + Continuous auto focus. + + @constant VKT_FOCUS_MODE_LOCKED + Locked auto focus. Focus settings will be fixed to their current values. + Requires `VKT_CAMERA_FLAG_FOCUS_LOCK` device flag. +*/ +enum VKTFocusMode { + VKT_FOCUS_MODE_CONTINUOUS = 0, + VKT_FOCUS_MODE_LOCKED = 1, +}; +typedef enum VKTFocusMode VKTFocusMode; + +/*! + @enum VKTImageFormat + + @abstract Camera image format. + + @constant VKT_IMAGE_FORMAT_UNKNOWN + Unknown or invalid format. + + @constant VKT_IMAGE_FORMAT_YCbCr420 + YUV semi-planar format. + + @constant VKT_IMAGE_FORMAT_RGBA8888 + RGBA8888 interleaved format. + + @constant VKT_IMAGE_FORMAT_BGRA8888 + BGRA8888 interleaved format. + */ +enum VKTImageFormat { + VKT_IMAGE_FORMAT_UNKNOWN = 0, + VKT_IMAGE_FORMAT_YCbCr420 = 1, + VKT_IMAGE_FORMAT_RGBA8888 = 2, + VKT_IMAGE_FORMAT_BGRA8888 = 3, +}; +typedef enum VKTImageFormat VKTImageFormat; + +/*! + @enum VKTImageOrientation + + @abstract Camera device frame orientation. + + @constant VKT_ORIENTATION_LAVKTSCAPE_LEFT + Landscape left. + + @constant VKT_ORIENTATION_PORTRAIT + Portrait. + + @constant VKT_ORIENTATION_LAVKTSCAPE_RIGHT + Landscape right. + + @constant VKT_ORIENTATION_PORTRAIT_UPSIDE_DOWN + Portrait upside down. +*/ +enum VKTImageOrientation { + VKT_ORIENTATION_LANDSCAPE_LEFT = 3, + VKT_ORIENTATION_PORTRAIT = 1, + VKT_ORIENTATION_LANDSCAPE_RIGHT = 4, + VKT_ORIENTATION_PORTRAIT_UPSIDE_DOWN = 2 +}; +typedef enum VKTImageOrientation VKTImageOrientation; + +/*! + @enum VKTMetadata + + @abstract Sample buffer metadata key. + + @constant VKT_METADATA_INTRINSIC_MATRIX + Camera intrinsic matrix. Value array must have enough capacity for 9 float values. + + @constant VKT_METADATA_EXPOSURE_BIAS + Camera image exposure bias value in EV. + + @constant VKT_METADATA_EXPOSURE_DURATION + Camera image exposure duration in seconds. + + @constant VKT_METADATA_FOCAL_LENGTH + Camera image focal length. + + @constant VKT_METADATA_F_NUMBER + Camera image aperture F-number. + + @constant VKT_METADATA_BRIGHTNESS + Camera image ambient brightness. +*/ +enum VKTMetadata { + VKT_METADATA_INTRINSIC_MATRIX = 1, + VKT_METADATA_EXPOSURE_BIAS = 2, + VKT_METADATA_EXPOSURE_DURATION = 3, + VKT_METADATA_FOCAL_LENGTH = 4, + VKT_METADATA_F_NUMBER = 5, + VKT_METADATA_BRIGHTNESS = 6, + VKT_METADATA_ISO = 7, +}; +typedef enum VKTMetadata VKTMetadata; + +/*! + @enum VKTTorchMode + + @abstract Camera device torch mode. + + @constant VKT_TORCH_MODE_OFF + Disabled torch mode. + + @constant VKT_TORCH_MODE_MAXIMUM + Maximum torch mode. + Requires `VKT_CAMERA_FLAG_TORCH` device flag. +*/ +enum VKTTorchMode { + VKT_TORCH_MODE_OFF = 0, + VKT_TORCH_MODE_MAXIMUM = 100, +}; +typedef enum VKTTorchMode VKTTorchMode; + +/*! + @enum VKTVideoStabilizationMode + + @abstract Camera device video stabilization mode. 
+ + @constant VKT_VIDEO_STABILIZATION_OFF + Disabled video stabilization. + + @constant VKT_VIDEO_STABILIZATION_STANDARD + Standard video stabilization + Requires `VKT_CAMERA_FLAG_VIDEO_STABILIZATION` device flag. +*/ +enum VKTVideoStabilizationMode { + VKT_VIDEO_STABILIZATION_OFF = 0, + VKT_VIDEO_STABILIZATION_STANDARD = 1, +}; +typedef enum VKTVideoStabilizationMode VKTVideoStabilizationMode; + +/*! + @enum VKTWhiteBalanceMode + + @abstract Camera device white balance mode. + + @constant VKT_WHITE_BALANCE_MODE_CONTINUOUS + Continuous auto white balance. + + @constant VKT_WHITE_BALANCE_MODE_LOCKED + Locked auto white balance. White balance settings will be fixed to their current values. + Requires `VKT_CAMERA_FLAG_WHITE_BALANCE_LOCK` device flag. +*/ +enum VKTWhiteBalanceMode { + VKT_WHITE_BALANCE_MODE_CONTINUOUS = 0, + VKT_WHITE_BALANCE_MODE_LOCKED = 1, +}; +typedef enum VKTWhiteBalanceMode VKTWhiteBalanceMode; +#pragma endregion + + +#pragma region --Types-- +/*! + @typedef VKTCamera + + @abstract Camera device. + + @discussion Camera device. +*/ +typedef VKTDevice VKTCamera; + +/*! + @typedef VKTCameraImage + + @abstract Camera image. + + @discussion Camera image. +*/ +typedef VKTSampleBuffer VKTCameraImage; +#pragma endregion + + +#pragma region --CameraDevice-- +/*! + @function VKTDiscoverCameras + + @abstract Discover available camera devices. + + @discussion Discover available camera devices. + + @param handler + Device handler. MUST NOT be `NULL`. + + @param context + Handler context. + + @returns Status. +*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTDiscoverCameras ( + VKTDeviceDiscoveryHandler handler, + void* context +); + +/*! + @function VKTCameraGetFieldOfView + + @abstract Camera field of view in degrees. + + @discussion Camera field of view in degrees. + + @param camera + Camera device. + + @param outWidth + Output FOV width in degrees. + + @param outHeight + Output FOV height in degrees. +*/ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraGetFieldOfView ( + VKTCamera* camera, + float* outWidth, + float* outHeight +); + +/*! + @function VKTCameraGetExposureBiasRange + + @abstract Camera exposure bias range in EV. + + @discussion Camera exposure bias range in EV. + + @param camera + Camera device. + + @param outMin + Output minimum exposure bias. + + @param outMax + Output maximum exposure bias. +*/ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraGetExposureBiasRange ( + VKTCamera* camera, + float* outMin, + float* outMax +); + +/*! + @function VKTCameraGetExposureDurationRange + + @abstract Camera exposure duration range in seconds. + + @discussion Camera exposure duration range in seconds. + + @param camera + Camera device. + + @param outMin + Output minimum exposure duration in seconds. + + @param outMax + Output maximum exposure duration in seconds. +*/ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraGetExposureDurationRange ( + VKTCamera* camera, + float* outMin, + float* outMax +); + +/*! + @function VKTCameraGetISORange + + @abstract Camera sensor sensitivity range. + + @discussion Camera sensor sensitivity range. + + @param camera + Camera device. + + @param outMin + Output minimum ISO sensitivity value. + + @param outMax + Output maximum ISO sensitivity value. +*/ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraGetISORange ( + VKTCamera* camera, + float* outMin, + float* outMax +); + +/*! + @function VKTCameraGetZoomRange + + @abstract Camera optical zoom range. + + @discussion Camera optical zoom range. + + @param camera + Camera device. 
+ + @param outMin + Output minimum zoom ratio. + + @param outMax + Output maximum zoom ratio. +*/ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraGetZoomRange ( + VKTCamera* camera, + float* outMin, + float* outMax +); + +/*! + @function VKTCameraGetPreviewResolution + + @abstract Get the camera preview resolution. + + @discussion Get the camera preview resolution. + + @param camera + Camera device. + + @param outWidth + Output width in pixels. + + @param outHeight + Output height in pixels. +*/ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraGetPreviewResolution ( + VKTCamera* camera, + int32_t* outWidth, + int32_t* outHeight +); + +/*! + @function VKTCameraSetPreviewResolution + + @abstract Set the camera preview resolution. + + @discussion Set the camera preview resolution. + + Most camera devices do not support arbitrary preview resolutions, so the camera will + set a supported resolution which is closest to the requested resolution that is specified. + + Note that this method should only be called before the camera preview is started. + + @param camera + Camera device. + + @param width + Width in pixels. + + @param height + Height in pixels. +*/ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraSetPreviewResolution ( + VKTCamera* camera, + int32_t width, + int32_t height +); + +/*! + @function VKTCameraGetPhotoResolution + + @abstract Get the camera photo resolution. + + @discussion Get the camera photo resolution. + + @param camera + Camera device. + + @param outWidth + Output width in pixels. + + @param outHeight + Output height in pixels. +*/ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraGetPhotoResolution ( + VKTCamera* camera, + int32_t* outWidth, + int32_t* outHeight +); + +/*! + @function VKTCameraSetPhotoResolution + + @abstract Set the camera photo resolution. + + @discussion Set the camera photo resolution. + + Most camera devices do not support arbitrary photo resolutions, so the camera will + set a supported resolution which is closest to the requested resolution that is specified. + + Note that this method should only be called before the camera preview is started. + + @param camera + Camera device. + + @param width + Width in pixels. + + @param height + Height in pixels. +*/ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraSetPhotoResolution ( + VKTCamera* camera, + int32_t width, + int32_t height +); + +/*! + @function VKTCameraGetFrameRate + + @abstract Get the camera preview frame rate. + + @discussion Get the camera preview frame rate. + + @param camera + Camera device. + + @returns Camera preview frame rate. +*/ +VKT_BRIDGE VKT_EXPORT int32_t VKT_API VKTCameraGetFrameRate (VKTCamera* camera); + +/*! + @function VKTCameraSetFrameRate + + @abstract Set the camera preview frame rate. + + @discussion Set the camera preview frame rate. + + Note that this method should only be called before the camera preview is started. + + @param camera + Camera device. + + @param frameRate + Frame rate to set. +*/ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraSetFrameRate ( + VKTCamera* camera, + int32_t frameRate +); + +/*! + @function VKTCameraGetExposureMode + + @abstract Get the camera exposure mode. + + @discussion Get the camera exposure mode. + + @param camera + Camera device. + + @returns Exposure mode. +*/ +VKT_BRIDGE VKT_EXPORT VKTExposureMode VKT_API VKTCameraGetExposureMode (VKTCamera* camera); + +/*! + @function VKTCameraSetExposureMode + + @abstract Set the camera exposure mode. + + @discussion Set the camera exposure mode. + + @param camera + Camera device. 
+ + @param mode + Exposure mode. + */ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraSetExposureMode ( + VKTCamera* camera, + VKTExposureMode mode +); + +/*! + @function VKTCameraGetExposureBias + + @abstract Get the camera exposure bias. + + @discussion Get the camera exposure bias. + + @param camera + Camera device. + + @returns Camera exposure bias. + */ +VKT_BRIDGE VKT_EXPORT float VKT_API VKTCameraGetExposureBias (VKTCamera* camera); + +/*! + @function VKTCameraSetExposureBias + + @abstract Set the camera exposure bias. + + @discussion Set the camera exposure bias. + + Note that the value MUST be in the camera exposure range. + + @param camera + Camera device. + + @param bias + Exposure bias value to set. + */ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraSetExposureBias ( + VKTCamera* camera, + float bias +); + +/*! + @function VKTCameraGetExposureDuration + + @abstract Get the camera exposure duration. + + @discussion Get the camera exposure duration. + + @param camera + Camera device. + + @returns Camera exposure duration in seconds. + */ +VKT_BRIDGE VKT_EXPORT float VKT_API VKTCameraGetExposureDuration (VKTCamera* camera); + +/*! + @function VKTCameraGetISO + + @abstract Get the camera sensitivity. + + @discussion Get the camera sensitivity. + + @param camera + Camera device. + + @returns Camera sensitivity. + */ +VKT_BRIDGE VKT_EXPORT float VKT_API VKTCameraGetISO (VKTCamera* camera); + +/*! + @function VKTCameraSetExposureDuration + + @abstract Set the camera exposure duration. + + @discussion Set the camera exposure duration. + This method will automatically change the camera's exposure mode to `MANUAL`. + + @param camera + Camera device. + + @param duration + Exposure duration in seconds. MUST be in `ExposureDurationRange`. + + @param ISO + Shutter sensitivity. MUST be in `ISORange`. + */ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraSetExposureDuration ( + VKTCamera* camera, + float duration, + float ISO +); + +/*! + @function VKTCameraSetExposurePoint + + @abstract Set the camera exposure point of interest. + + @discussion Set the camera exposure point of interest. + The coordinates are specified in viewport space, with each value in range [0., 1.]. + + @param camera + Camera device. + + @param x + Exposure point x-coordinate in viewport space. + + @param y + Exposure point y-coordinate in viewport space. + */ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraSetExposurePoint ( + VKTCamera* camera, + float x, + float y +); + +/*! + @function VKTCameraGetFlashMode + + @abstract Get the camera photo flash mode. + + @discussion Get the camera photo flash mode. + + @param camera + Camera device. + + @returns Camera photo flash mode. + */ +VKT_BRIDGE VKT_EXPORT VKTFlashMode VKT_API VKTCameraGetFlashMode (VKTCamera* camera); + +/*! + @function VKTCameraSetFlashMode + + @abstract Set the camera photo flash mode. + + @discussion Set the camera photo flash mode. + + @param camera + Camera device. + + @param mode + Flash mode to set. + */ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraSetFlashMode ( + VKTCamera* camera, + VKTFlashMode mode +); + +/*! + @function VKTCameraGetFocusMode + + @abstract Get the camera focus mode. + + @discussion Get the camera focus mode. + + @param camera + Camera device. + + @returns Camera focus mode. + */ +VKT_BRIDGE VKT_EXPORT VKTFocusMode VKT_API VKTCameraGetFocusMode (VKTCamera* camera); + +/*! + @function VKTCameraSetFocusMode + + @abstract Set the camera focus mode. + + @discussion Set the camera focus mode. + + @param camera + Camera device. 
+ + @param mode + Focus mode. + */ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraSetFocusMode ( + VKTCamera* camera, + VKTFocusMode mode +); + +/*! + @function VKTCameraSetFocusPoint + + @abstract Set the camera focus point. + + @discussion Set the camera focus point of interest. + The coordinates are specified in viewport space, with each value in range [0., 1.]. + This function should only be used if the camera supports setting the focus point. + See `VKTCameraFocusPointSupported`. + + @param camera + Camera device. + + @param x + Focus point x-coordinate in viewport space. + + @param y + Focus point y-coordinate in viewport space. + */ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraSetFocusPoint ( + VKTCamera* camera, + float x, + float y +); + +/*! + @function VKTCameraGetTorchMode + + @abstract Get the current camera torch mode. + + @discussion Get the current camera torch mode. + + @param camera + Camera device. + + @returns Current camera torch mode. + */ +VKT_BRIDGE VKT_EXPORT VKTTorchMode VKT_API VKTCameraGetTorchMode (VKTCamera* camera); + +/*! + @function VKTCameraSetTorchMode + + @abstract Set the camera torch mode. + + @discussion Set the camera torch mode. + + @param camera + Camera device. + + @param mode + Torch mode. + */ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraSetTorchMode ( + VKTCamera* camera, + VKTTorchMode mode +); + +/*! + @function VKTCameraGetWhiteBalanceMode + + @abstract Get the camera white balance mode. + + @discussion Get the camera white balance mode. + + @param camera + Camera device. + + @returns White balance mode. + */ +VKT_BRIDGE VKT_EXPORT VKTWhiteBalanceMode VKT_API VKTCameraGetWhiteBalanceMode (VKTCamera* camera); + +/*! + @function VKTCameraSetWhiteBalanceMode + + @abstract Set the camera white balance mode. + + @discussion Set the camera white balance mode. + + @param camera + Camera device. + + @param mode + White balance mode. + */ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraSetWhiteBalanceMode ( + VKTCamera* camera, + VKTWhiteBalanceMode mode +); + +/*! + @function VKTCameraGetVideoStabilizationMode + + @abstract Get the camera video stabilization mode. + + @discussion Get the camera video stabilization mode. + + @param camera + Camera device. + + @returns Video stabilization mode. + */ +VKT_BRIDGE VKT_EXPORT VKTVideoStabilizationMode VKT_API VKTCameraGetVideoStabilizationMode (VKTCamera* camera); + +/*! + @function VKTCameraSetVideoStabilizationMode + + @abstract Set the camera video stabilization mode. + + @discussion Set the camera video stabilization mode. + + @param camera + Camera device. + + @param mode + Video stabilization mode. + */ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraSetVideoStabilizationMode ( + VKTCamera* camera, + VKTVideoStabilizationMode mode +); + +/*! + @function VKTCameraGetZoomRatio + + @abstract Get the camera zoom ratio. + + @discussion Get the camera zoom ratio. + This value will always be within the minimum and maximum zoom values reported by the camera device. + + @param camera + Camera device. + + @returns Zoom ratio. + */ +VKT_BRIDGE VKT_EXPORT float VKT_API VKTCameraGetZoomRatio (VKTCamera* camera); + +/*! + @function VKTCameraSetZoomRatio + + @abstract Set the camera zoom ratio. + + @discussion Set the camera zoom ratio. + This value must always be within the minimum and maximum zoom values reported by the camera device. + + @param camera + Camera device. + + @param ratio + Zoom ratio. + */ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraSetZoomRatio ( + VKTCamera* camera, + float ratio +); + +/*! 
+ @function VKTCameraCapturePhoto + + @abstract Capture a still photo. + + @discussion Capture a still photo. + + @param camera + Camera device. + + @param handler + Photo handler. + + @param context + User-provided context. + */ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraCapturePhoto ( + VKTCamera* camera, + VKTSampleBufferHandler handler, + void* context +); +#pragma endregion + + +#pragma region --CameraImage-- +/*! + @function VKTCameraImageGetData + + @abstract Get the image data of a camera image. + + @discussion Get the image data of a camera image. + If the camera image uses a planar format, this will return `NULL`. + + @param cameraImage + Camera image. +*/ +VKT_BRIDGE VKT_EXPORT void* VKT_API VKTCameraImageGetData (VKTCameraImage* cameraImage); + +/*! + @function VKTCameraImageGetDataSize + + @abstract Get the image data size of a camera image in bytes. + + @discussion Get the image data size of a camera image in bytes. + + @param cameraImage + Camera image. +*/ +VKT_BRIDGE VKT_EXPORT int32_t VKT_API VKTCameraImageGetDataSize (VKTCameraImage* cameraImage); + +/*! + @function VKTCameraImageGetFormat + + @abstract Get the format of a camera image. + + @discussion Get the format of a camera image. + + @param cameraImage + Camera image. +*/ +VKT_BRIDGE VKT_EXPORT VKTImageFormat VKT_API VKTCameraImageGetFormat (VKTCameraImage* cameraImage); + +/*! + @function VKTCameraImageGetWidth + + @abstract Get the width of a camera image. + + @discussion Get the width of a camera image. + + @param cameraImage + Camera image. +*/ +VKT_BRIDGE VKT_EXPORT int32_t VKT_API VKTCameraImageGetWidth (VKTCameraImage* cameraImage); + +/*! + @function VKTCameraImageGetHeight + + @abstract Get the height of a camera image. + + @discussion Get the height of a camera image. + + @param cameraImage + Camera image. +*/ +VKT_BRIDGE VKT_EXPORT int32_t VKT_API VKTCameraImageGetHeight (VKTCameraImage* cameraImage); + +/*! + @function VKTCameraImageGetRowStride + + @abstract Get the row stride of a camera image in bytes. + + @discussion Get the row stride of a camera image in bytes. + + @param cameraImage + Camera image. + + @returns Row stride in bytes. +*/ +VKT_BRIDGE VKT_EXPORT int32_t VKT_API VKTCameraImageGetRowStride (VKTCameraImage* cameraImage); + +/*! + @function VKTCameraImageGetTimestamp + + @abstract Get the timestamp of a camera image. + + @discussion Get the timestamp of a camera image. + + @param cameraImage + Camera image. + + @returns Image timestamp in nanoseconds. +*/ +VKT_BRIDGE VKT_EXPORT int64_t VKT_API VKTCameraImageGetTimestamp (VKTCameraImage* cameraImage); + +/*! + @function VKTCameraImageGetVerticallyMirrored + + @abstract Whether the camera image is vertically mirrored. + + @discussion Whether the camera image is vertically mirrored. + + @param cameraImage + Camera image. +*/ +VKT_BRIDGE VKT_EXPORT bool VKT_API VKTCameraImageGetVerticallyMirrored (VKTCameraImage* cameraImage); + +/*! + @function VKTCameraImageGetPlaneCount + + @abstract Get the plane count of a camera image. + + @discussion Get the plane count of a camera image. + If the image uses an interleaved format or only has a single plane, this function returns zero. + + @param cameraImage + Camera image. + + @returns Number of planes in image. +*/ +VKT_BRIDGE VKT_EXPORT int32_t VKT_API VKTCameraImageGetPlaneCount (VKTCameraImage* cameraImage); + +/*! + @function VKTCameraImageGetPlaneData + + @abstract Get the plane data for a given plane of a camera image. + + @discussion Get the plane data for a given plane of a camera image. 
+ + @param cameraImage + Camera image. + + @param planeIdx + Plane index. +*/ +VKT_BRIDGE VKT_EXPORT void* VKT_API VKTCameraImageGetPlaneData ( + VKTCameraImage* cameraImage, + int32_t planeIdx +); + +/*! + @function VKTCameraImageGetPlaneDataSize + + @abstract Get the plane data size of a given plane of a camera image. + + @discussion Get the plane data size of a given plane of a camera image. + + @param cameraImage + Camera image. + + @param planeIdx + Plane index. +*/ +VKT_BRIDGE VKT_EXPORT int32_t VKT_API VKTCameraImageGetPlaneDataSize ( + VKTCameraImage* cameraImage, + int32_t planeIdx +); + +/*! + @function VKTCameraImageGetPlaneWidth + + @abstract Get the width of a given plane of a camera image. + + @discussion Get the width of a given plane of a camera image. + + @param cameraImage + Camera image. + + @param planeIdx + Plane index. +*/ +VKT_BRIDGE VKT_EXPORT int32_t VKT_API VKTCameraImageGetPlaneWidth ( + VKTCameraImage* cameraImage, + int32_t planeIdx +); + +/*! + @function VKTCameraImageGetPlaneHeight + + @abstract Get the height of a given plane of a camera image. + + @discussion Get the height of a given plane of a camera image. + + @param cameraImage + Camera image. + + @param planeIdx + Plane index. +*/ +VKT_BRIDGE VKT_EXPORT int32_t VKT_API VKTCameraImageGetPlaneHeight ( + VKTCameraImage* cameraImage, + int32_t planeIdx +); + +/*! + @function VKTCameraImageGetPlanePixelStride + + @abstract Get the plane pixel stride for a given plane of a camera image. + + @discussion Get the plane pixel stride for a given plane of a camera image. + + @param cameraImage + Camera image. + + @param planeIdx + Plane index. + + @returns Plane pixel stride in bytes. +*/ +VKT_BRIDGE VKT_EXPORT int32_t VKT_API VKTCameraImageGetPlanePixelStride ( + VKTCameraImage* cameraImage, + int32_t planeIdx +); + +/*! + @function VKTCameraImageGetPlaneRowStride + + @abstract Get the plane row stride for a given plane of a camera image. + + @discussion Get the plane row stride for a given plane of a camera image. + + @param cameraImage + Camera image. + + @param planeIdx + Plane index. + + @returns Plane row stride in bytes. +*/ +VKT_BRIDGE VKT_EXPORT int32_t VKT_API VKTCameraImageGetPlaneRowStride ( + VKTCameraImage* cameraImage, + int32_t planeIdx +); + +/*! + @function VKTCameraImageGetMetadata + + @abstract Get the metadata value for a given key in a camera image. + + @discussion Get the metadata value for a given key in a camera image. + + @param cameraImage + Camera image. + + @param key + Metadata key. + + @param value + Destination value array. + + @param count + Destination value array size. + + @returns Whether the metadata key was successfully looked up. +*/ +VKT_BRIDGE VKT_EXPORT bool VKT_API VKTCameraImageGetMetadata ( + VKTCameraImage* cameraImage, + VKTMetadata key, + float* value, + int32_t count +); + +/*! + @function VKTCameraImageConvertToRGBA8888 + + @abstract Convert a camera image to an RGBA8888 pixel buffer. + + @discussion Convert a camera image to an RGBA8888 pixel buffer. + + @param cameraImage + Camera image. + + @param orientation + Desired image orientation. + + @param mirror + Whether to vertically mirror the pixel buffer. + + @param tempBuffer + Temporary pixel buffer for intermediate conversions. Pass `NULL` to use short-time allocations. + + @param dstBuffer + Destination pixel buffer. This must be at least `width * height * 4` bytes large. + + @param dstWidth + Output pixel buffer width. + + @param dstHeight + Output pixel buffer height. 
+*/ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTCameraImageConvertToRGBA8888 ( + VKTCameraImage* cameraImage, + VKTImageOrientation orientation, + bool mirror, + uint8_t* tempBuffer, + uint8_t* dstBuffer, + int32_t* dstWidth, + int32_t* dstHeight +); +#pragma endregion diff --git a/Plugins/macOS/VideoKit.bundle/Contents/Headers/VKTDevice.h b/Plugins/macOS/VideoKit.bundle/Contents/Headers/VKTDevice.h new file mode 100644 index 0000000..6e3eb5a --- /dev/null +++ b/Plugins/macOS/VideoKit.bundle/Contents/Headers/VKTDevice.h @@ -0,0 +1,369 @@ +// +// VKTDevice.h +// VideoKit +// +// Created by Yusuf Olokoba on 5/15/2023. +// Copyright © 2023 NatML Inc. All rights reserved. +// + +#pragma once + +#include +#include +#include + +#pragma region --Enumerations-- +/*! + @enum VKTDeviceFlags + + @abstract Immutable properties of media devices. + + @constant VKT_DEVICE_FLAG_INTERNAL + Device is internal. + + @constant VKT_DEVICE_FLAG_EXTERNAL + Device is external. + + @constant VKT_DEVICE_FLAG_DEFAULT + Device is the default device for its media type. + + @constant VKT_MICROPHONE_FLAG_ECHO_CANCELLATION + Audio device supports echo cancellation. + + @constant VKT_CAMERA_FLAG_FRONT_FACING + Camera device is front-facing. + + @constant VKT_CAMERA_FLAG_FLASH + Camera device supports flash when capturing photos. + + @constant VKT_CAMERA_FLAG_TORCH + Camera device supports torch. + + @constant VKT_CAMERA_FLAG_DEPTH + Camera device supports depth streaming. + + @constant VKT_CAMERA_FLAG_EXPOSURE_CONTINUOUS + Camera device supports continuous auto exposure. + + @constant VKT_CAMERA_FLAG_EXPOSURE_LOCK + Camera device supports locked auto exposure. + + @constant VKT_CAMERA_FLAG_EXPOSURE_MANUAL + Camera device supports manual exposure. + + @constant VKT_CAMERA_FLAG_EXPOSURE_POINT + Camera device supports setting exposure point. + + @constant VKT_CAMERA_FLAG_FOCUS_CONTINUOUS + Camera device supports continuous auto exposure. + + @constant VKT_CAMERA_FLAG_LOCKED_FOCUS + Camera device supports locked auto focus. + + @constant VKT_CAMERA_FLAG_FOCUS_POINT + Camera device supports setting focus point. + + @constant VKT_CAMERA_FLAG_WHITE_BALANCE_CONTINUOUS + Camera device supports continuous auto white balance. + + @constant VKT_CAMERA_FLAG_WHITE_BALANCE_LOCK + Camera device supports locked auto white balance. + + @constant VKT_CAMERA_FLAG_VIDEO_STABILIZATION + Camera device supports video stabilization. +*/ +enum VKTDeviceFlags { + VKT_DEVICE_FLAG_INTERNAL = 1 << 0, + VKT_DEVICE_FLAG_EXTERNAL = 1 << 1, + VKT_DEVICE_FLAG_DEFAULT = 1 << 3, + VKT_MICROPHONE_FLAG_ECHO_CANCELLATION = 1 << 2, + VKT_CAMERA_FLAG_FRONT_FACING = 1 << 6, + VKT_CAMERA_FLAG_FLASH = 1 << 7, + VKT_CAMERA_FLAG_TORCH = 1 << 8, + VKT_CAMERA_FLAG_DEPTH = 1 << 15, + VKT_CAMERA_FLAG_EXPOSURE_CONTINUOUS = 1 << 16, + VKT_CAMERA_FLAG_EXPOSURE_LOCK = 1 << 11, + VKT_CAMERA_FLAG_EXPOSURE_MANUAL = 1 << 14, + VKT_CAMERA_FLAG_EXPOSURE_POINT = 1 << 9, + VKT_CAMERA_FLAG_FOCUS_CONTINUOUS = 1 << 17, + VKT_CAMERA_FLAG_FOCUS_LOCK = 1 << 12, + VKT_CAMERA_FLAG_FOCUS_POINT = 1 << 10, + VKT_CAMERA_FLAG_WHITE_BALANCE_CONTINUOUS = 1 << 18, + VKT_CAMERA_FLAG_WHITE_BALANCE_LOCK = 1 << 13, + VKT_CAMERA_FLAG_VIDEO_STABILIZATION = 1 << 19, +}; +typedef enum VKTDeviceFlags VKTDeviceFlags; + +/*! + @enum VKTDevicePermissionType + + @abstract Media device permission type. + + @constant VKT_DEVICE_PERMISSION_TYPE_MICROPHONE + Request microphone permissions. + + @constant VKT_DEVICE_PERMISSION_TYPE_CAMERA + Request camera permissions. 
+*/ +enum VKTDevicePermissionType { + VKT_DEVICE_PERMISSION_TYPE_MICROPHONE = 1, + VKT_DEVICE_PERMISSION_TYPE_CAMERA = 2, +}; +typedef enum VKTDevicePermissionType VKTDevicePermissionType; + +/*! + @enum VKTDevicePermissionStatus + + @abstract Media device permission status. + + @constant VKT_DEVICE_PERMISSION_STATUS_UNKNOWN + User has not authorized or denied access to media device. + + @constant VKT_DEVICE_PERMISSION_STATUS_DENIED + User has denied access to media device. + + @constant VKT_DEVICE_PERMISSION_STATUS_AUTHORIZED + User has authorized access to media device. +*/ +enum VKTDevicePermissionStatus { + VKT_DEVICE_PERMISSION_STATUS_UNKNOWN = 0, + VKT_DEVICE_PERMISSION_STATUS_DENIED = 2, + VKT_DEVICE_PERMISSION_STATUS_AUTHORIZED = 3, +}; +typedef enum VKTDevicePermissionStatus VKTDevicePermissionStatus; +#pragma endregion + + +#pragma region --Types-- +/*! + @struct VKTDevice + + @abstract Media device. + + @discussion Media device. +*/ +struct VKTDevice; +typedef struct VKTDevice VKTDevice; + +/*! + @struct VKTSampleBuffer + + @abstract Sample buffer generated by a media device. + + @discussion Sample buffer generated by a media device. +*/ +struct VKTSampleBuffer; +typedef struct VKTSampleBuffer VKTSampleBuffer; +#pragma endregion + + +#pragma region --Delegates-- +/*! + @abstract Callback invoked with discovered media devices. + + @param context + User-provided context. + + @param devices + Discovered devices. These MUST be released when they are no longer needed. + + @param count + Discovered device count. +*/ +typedef void (*VKTDeviceDiscoveryHandler) (void* context, VKTDevice** devices, int32_t count); + +/*! + @abstract Callback invoked with new sample buffer from a media device. + + @param context + User-provided context. + + @param sampleBuffer + Sample buffer. +*/ +typedef void (*VKTSampleBufferHandler) (void* context, VKTSampleBuffer* sampleBuffer); + +/*! + @abstract Callback invoked when a camera device is disconnected. + + @param context + User-provided context. + + @param device + Media device that was disconnected. +*/ +typedef void (*VKTDeviceDisconnectHandler) (void* context, VKTDevice* device); + +/*! + @abstract Callback invoked with status of a media device permission request. + + @param context + User-provided context. + + @param status + Permission status. +*/ +typedef void (*VKTDevicePermissionHandler) (void* context, VKTDevicePermissionStatus status); +#pragma endregion + + +#pragma region --Client API-- +/*! + @function VKTDeviceRelease + + @abstract Release a media device. + + @discussion Release a media device. + + @param device + Media device. +*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTDeviceRelease (VKTDevice* device); + +/*! + @function VKTDeviceGetUniqueID + + @abstract Get the media device unique ID. + + @discussion Get the media device unique ID. + + @param device + Media device. + + @param destination + Destination UTF-8 string. +*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTDeviceGetUniqueID ( + VKTDevice* device, + char* destination +); + +/*! + @function VKTDeviceGetName + + @abstract Media device name. + + @discussion Media device name. + + @param device + Media device. + + @param destination + Destination UTF-8 string. +*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTDeviceGetName ( + VKTDevice* device, + char* destination +); + +/*! + @function VKTDeviceGetFlags + + @abstract Get the media device flags. + + @discussion Get the media device flags. + + @param device + Media device. + + @returns Device flags. 
+*/ +VKT_BRIDGE VKT_EXPORT VKTDeviceFlags VKT_API VKTDeviceGetFlags (VKTDevice* device); + +/*! + @function VKTDeviceIsRunning + + @abstract Is the device running? + + @discussion Is the device running? + + @param device + Media device. + + @returns True if media device is running. +*/ +VKT_BRIDGE VKT_EXPORT bool VKT_API VKTDeviceIsRunning (VKTDevice* device); + +/*! + @function VKTDeviceStartRunning + + @abstract Start running an media device. + + @discussion Start running an media device. + + @param device + Media device. + + @param sampleBufferHandler + Sample buffer delegate to receive sample buffers as the device produces them. + + @param context + User-provided context to be passed to the sample buffer delegate. Can be `NULL`. +*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTDeviceStartRunning ( + VKTDevice* device, + VKTSampleBufferHandler sampleBufferHandler, + void* context +); + +/*! + @function VKTDeviceStopRunning + + @abstract Stop running device. + + @discussion Stop running device. + + @param device + Media device. +*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTDeviceStopRunning (VKTDevice* device); + +/*! + @function VKTDeviceSetDisconnectHandler + + @abstract Set the device disconnect handler. + + @discussion Set the device disconnect handler. + This provided function pointer is invoked when the device is disconnected. + + @param device + Media device. + + @param disconnectHandler + Device disconnect handler. Can be `NULL`. + + @param context + User-provided context. Can be `NULL`. +*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTDeviceSetDisconnectHandler ( + VKTDevice* device, + VKTDeviceDisconnectHandler disconnectHandler, + void* context +); + +/*! + @function VKTDeviceCheckPermissions + + @abstract Check permissions for a given media device type. + + @discussion Check permissions for a given media device type. + + @param type + Permission type. + + @param request + Whether to request the permissions if the user has not been asked. + + @param handler + Permission delegate to receive result of permission request. + + @param context + User-provided context to be passed to the permission delegate. Can be `NULL`. +*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTDeviceCheckPermissions ( + VKTDevicePermissionType type, + bool request, + VKTDevicePermissionHandler handler, + void* context +); +#pragma endregion diff --git a/Plugins/macOS/VideoKit.bundle/Contents/Headers/VKTMicrophone.h b/Plugins/macOS/VideoKit.bundle/Contents/Headers/VKTMicrophone.h new file mode 100644 index 0000000..d755a01 --- /dev/null +++ b/Plugins/macOS/VideoKit.bundle/Contents/Headers/VKTMicrophone.h @@ -0,0 +1,213 @@ +// +// VKTMicrophone.h +// VideoKit +// +// Created by Yusuf Olokoba on 5/15/2023. +// Copyright © 2023 NatML Inc. All rights reserved. +// + +#pragma once + +#include + +#pragma region --Types-- +/*! + @typedef VKTMicrophone + + @abstract Audio input device. + + @discussion Audio input device. +*/ +typedef VKTDevice VKTMicrophone; + +/*! + @typedef VKTAudioBuffer + + @abstract Audio buffer. + + @discussion Audio buffer +*/ +typedef VKTSampleBuffer VKTAudioBuffer; +#pragma endregion + + +#pragma region --AudioDevice-- +/*! + @function VKTDiscoverMicrophones + + @abstract Discover available audio input devices. + + @discussion Discover available audio input devices. + + @param handler + Device handler. MUST NOT be `NULL`. + + @param context + Handler context. + + @returns Status. 
+*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTDiscoverMicrophones ( + VKTDeviceDiscoveryHandler handler, + void* context +); + +/*! + @function VKTMicrophoneGetEchoCancellation + + @abstract Get the device echo cancellation mode. + + @discussion Get the device echo cancellation mode. + + @param microphone + Microphone. + + @returns True if the device performs adaptive echo cancellation. + */ +VKT_BRIDGE VKT_EXPORT bool VKT_API VKTMicrophoneGetEchoCancellation (VKTMicrophone* microphone); + +/*! + @function VKTMicrophoneSetEchoCancellation + + @abstract Enable or disable echo cancellation on the device. + + @discussion If the device does not support echo cancellation, this will be a nop. + + @param microphone + Microphone. + + @param echoCancellation + Echo cancellation. + */ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTMicrophoneSetEchoCancellation ( + VKTMicrophone* microphone, + bool echoCancellation +); + +/*! + @function VKTMicrophoneGetSampleRate + + @abstract Audio device sample rate. + + @discussion Audio device sample rate. + + @param microphone + Microphone. + + @returns Current sample rate. + */ +VKT_BRIDGE VKT_EXPORT int32_t VKT_API VKTMicrophoneGetSampleRate (VKTMicrophone* microphone); + +/*! + @function VKTMicrophoneSetSampleRate + + @abstract Set the audio device sample rate. + + @discussion Set the audio device sample rate. + + @param microphone + Microphone. + + @param sampleRate + Sample rate to set. + */ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTMicrophoneSetSampleRate ( + VKTMicrophone* microphone, + int32_t sampleRate +); + +/*! + @function VKTMicrophoneGetChannelCount + + @abstract Audio device channel count. + + @discussion Audio device channel count. + + @param microphone + Microphone. + + @returns Current channel count. + */ +VKT_BRIDGE VKT_EXPORT int32_t VKT_API VKTMicrophoneGetChannelCount (VKTMicrophone* microphone); + +/*! + @function VKTMicrophoneSetChannelCount + + @abstract Set the audio device channel count. + + @discussion Set the audio device channel count. + + @param microphone + Microphone. + + @param channelCount + Channel count to set. + */ +VKT_BRIDGE VKT_EXPORT void VKT_API VKTMicrophoneSetChannelCount ( + VKTMicrophone* microphone, + int32_t channelCount +); +#pragma endregion + + +#pragma region --AudioBuffer-- +/*! + @function VKTAudioBufferGetData + + @abstract Get the audio data of an audio buffer. + + @discussion Get the audio data of an audio buffer. + + @param audioBuffer + Audio buffer. +*/ +VKT_BRIDGE VKT_EXPORT float* VKT_API VKTAudioBufferGetData (VKTAudioBuffer* audioBuffer); + +/*! + @function VKTAudioBufferGetSampleCount + + @abstract Get the total sample count of an audio buffer. + + @discussion Get the total sample count of an audio buffer. + + @param audioBuffer + Audio buffer. +*/ +VKT_BRIDGE VKT_EXPORT int32_t VKT_API VKTAudioBufferGetSampleCount (VKTAudioBuffer* audioBuffer); + +/*! + @function VKTAudioBufferGetSampleRate + + @abstract Get the sample rate of an audio buffer. + + @discussion Get the sample rate of an audio buffer. + + @param audioBuffer + Audio buffer. +*/ +VKT_BRIDGE VKT_EXPORT int32_t VKT_API VKTAudioBufferGetSampleRate (VKTAudioBuffer* audioBuffer); + +/*! + @function VKTAudioBufferGetChannelCount + + @abstract Get the channel count of an audio buffer. + + @discussion Get the channel count of an audio buffer. + + @param audioBuffer + Audio buffer. +*/ +VKT_BRIDGE VKT_EXPORT int32_t VKT_API VKTAudioBufferGetChannelCount (VKTAudioBuffer* audioBuffer); + +/*! 
+ @function VKTAudioBufferGetTimestamp + + @abstract Get the timestamp of an audio buffer. + + @discussion Get the timestamp of an audio buffer. + + @param audioBuffer + Audio buffer. +*/ +VKT_BRIDGE VKT_EXPORT int64_t VKT_API VKTAudioBufferGetTimestamp (VKTAudioBuffer* audioBuffer); +#pragma endregion diff --git a/Plugins/macOS/VideoKit.bundle/Contents/Headers/VKTRecorder.h b/Plugins/macOS/VideoKit.bundle/Contents/Headers/VKTRecorder.h new file mode 100644 index 0000000..04d74ee --- /dev/null +++ b/Plugins/macOS/VideoKit.bundle/Contents/Headers/VKTRecorder.h @@ -0,0 +1,389 @@ +// +// VKTRecorder.h +// VideoKit +// +// Created by Yusuf Olokoba on 5/15/2023. +// Copyright © 2023 NatML Inc. All rights reserved. +// + +#pragma once + +#include +#include + +#pragma region --Types-- +/*! + @struct VKTRecorder + + @abstract Media recorder. + + @discussion Media recorder. +*/ +struct VKTRecorder; +typedef struct VKTRecorder VKTRecorder; + +/*! + @abstract Callback invoked with path to recorded media file. + + @param context + User context provided to recorder. + + @param path + Path to record file. If recording fails for any reason, this path will be `NULL`. +*/ +typedef void (*VKTRecordingHandler) (void* context, const char* path); +#pragma endregion + + +#pragma region --IMediaRecorder-- +/*! + @function VKTRecorderGetFrameSize + + @abstract Get the recorder's frame size. + + @discussion Get the recorder's frame size. + + @param recorder + Media recorder. + + @param outWidth + Output frame width. + + @param outHeight + Output frame height. + */ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTRecorderGetFrameSize ( + VKTRecorder* recorder, + int32_t* outWidth, + int32_t* outHeight +); + +/*! + @function VKTRecorderCommitFrame + + @abstract Commit a video frame to the recording. + + @discussion Commit a video frame to the recording. + + @param recorder + Media recorder. + + @param pixelBuffer + Pixel buffer containing a single video frame. + The pixel buffer MUST be laid out in RGBA8888 order (32 bits per pixel). + + @param timestamp + Frame timestamp. The spacing between consecutive timestamps determines the + effective framerate of some recorders. + */ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTRecorderCommitFrame ( + VKTRecorder* recorder, + const uint8_t* pixelBuffer, + int64_t timestamp +); + +/*! + @function VKTRecorderCommitSamples + + @abstract Commit an audio frame to the recording. + + @discussion Commit an audio frame to the recording. + + @param recorder + Media recorder. + + @param sampleBuffer + Sample buffer containing a single audio frame. The sample buffer MUST be in 32-bit PCM + format, interleaved by channel for channel counts greater than 1. + + @param sampleCount + Total number of samples in the sample buffer. This should account for multiple channels. + + @param timestamp + Frame timestamp. The spacing between consecutive timestamps determines the + effective framerate of some recorders. +*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTRecorderCommitSamples ( + VKTRecorder* recorder, + const float* sampleBuffer, + int32_t sampleCount, + int64_t timestamp +); + +/*! + @function VKTRecorderFinishWriting + + @abstract Finish writing and invoke the completion handler. + + @discussion Finish writing and invoke the completion handler. The recorder is automatically + released, along with any resources it owns. The recorder MUST NOT be used once this function + has been invoked. + + The completion handler will be invoked soon after this function is called. 
If recording fails for any reason, + the completion handler will receive `NULL` for the recording path. + + @param recorder + Media recorder. + + @param handler + Recording completion handler invoked once recording is completed. + + @param context + Context passed to completion handler. Can be `NULL`. +*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTRecorderFinishWriting ( + VKTRecorder* recorder, + VKTRecordingHandler handler, + void* context +); +#pragma endregion + + +#pragma region --Constructors-- +/*! + @function VKTRecorderCreateMP4 + + @abstract Create an MP4 recorder. + + @discussion Create an MP4 recorder that records with the H.264 AVC codec. + + @param path + Recording path. This path must be writable on the local file system. + + @param width + Video width. + + @param height + Video height. + + @param frameRate + Video frame rate. + + @param sampleRate + Audio sample rate. Pass 0 if recording without audio. + + @param channelCount + Audio channel count. Pass 0 if recording without audio. + + @param videoBitRate + Video bit rate in bits per second. + + @param keyframeInterval + Video keyframe interval in seconds. + + @param audioBitRate + Audio bit rate in bits per second. Ignored if no audio format is provided. + + @param recorder + Output media recorder. + + @returns Recorder creation status. +*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTRecorderCreateMP4 ( + const char* path, + int32_t width, + int32_t height, + float frameRate, + int32_t sampleRate, + int32_t channelCount, + int32_t videoBitRate, + int32_t keyframeInterval, + int32_t audioBitRate, + VKTRecorder** recorder +); + +/*! + @function VKTRecorderCreateHEVC + + @abstract Create an HEVC recorder. + + @discussion Create an MP4 recorder that records with the H.265 HEVC codec. + + @param path + Recording path. This path must be writable on the local file system. + + @param width + Video width. + + @param height + Video height. + + @param frameRate + Video frame rate. + + @param sampleRate + Audio sample rate. Pass 0 if recording without audio. + + @param channelCount + Audio channel count. Pass 0 if recording without audio. + + @param videoBitRate + Video bit rate in bits per second. + + @param keyframeInterval + Video keyframe interval in seconds. + + @param audioBitRate + Audio bit rate in bits per second. Ignored if no audio format is provided. + + @param recorder + Output media recorder. + + @returns Recorder creation status. + */ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTRecorderCreateHEVC ( + const char* path, + int32_t width, + int32_t height, + float frameRate, + int32_t sampleRate, + int32_t channelCount, + int32_t videoBitRate, + int32_t keyframeInterval, + int32_t audioBitRate, + VKTRecorder** recorder +); + +/*! + @function VKTRecorderCreateWEBM + + @abstract Create a WEBM video recorder. + + @discussion Create a WEBM video recorder. + + @param path + Recording path. This path must be writable on the local file system. + + @param width + Video width. + + @param height + Vide. height. + + @param frameRate + Video frame rate. + + @param sampleRate + Audio sample rate. Pass 0 if recording without audio. + + @param channelCount + Audio channel count. Pass 0 if recording without audio. + + @param videoBitRate + Video bit rate in bits per second. + + @param keyframeInterval + Video keyframe interval in seconds. + + @param audioBitRate + Audio bit rate in bits per second. Ignored if no audio format is provided. + + @param recorder + Output media recorder. + + @returns Recorder creation status. 
+*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTRecorderCreateWEBM ( + const char* path, + int32_t width, + int32_t height, + float frameRate, + int32_t sampleRate, + int32_t channelCount, + int32_t videoBitRate, + int32_t keyframeInterval, + int32_t audioBitRate, + VKTRecorder** recorder +); + +/*! + @function VKTRecorderCreateGIF + + @abstract Create a GIF recorder. + + @discussion Create an animated GIF recorder. + The generated GIF image will loop forever. + + @param path + Recording path. This path must be writable on the local file system. + + @param width + Image width. + + @param height + Image height. + + @param delay + Per-frame delay in seconds. + + @param recorder + Output media recorder. + + @returns Recorder creation status. +*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTRecorderCreateGIF ( + const char* path, + int32_t width, + int32_t height, + float delay, + VKTRecorder** recorder +); + +/*! + @function VKTRecorderCreateWAV + + @abstract Create a WAV audio recorder. + + @discussion Create a WAV audio recorder. + + @param path + Recording path. This path must be writable on the local file system. + + @param sampleRate + Audio sample rate. + + @param channelCount + Audio channel count. + + @param recorder + Output media recorder. + + @returns Recorder creation status. +*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTRecorderCreateWAV ( + const char* path, + int32_t sampleRate, + int32_t channelCount, + VKTRecorder** recorder +); + +/*! + @function VKTRecorderCreateJPEG + + @abstract Create a JPEG image sequence recorder. + + @discussion Create a JPEG image sequence recorder. + The recorder returns a path separator-delimited list of image frame paths. + + @param width + Image width. + + @param height + Image height. + + @param compressionQuality + Image compression quality in range [0, 1]. + + @param recorder + Output media recorder. + + @returns Recorder creation status. +*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTRecorderCreateJPEG ( + const char* path, + int32_t width, + int32_t height, + float compressionQuality, + VKTRecorder** recorder +); +#pragma endregion \ No newline at end of file diff --git a/Plugins/macOS/VideoKit.bundle/Contents/Headers/VKTSession.h b/Plugins/macOS/VideoKit.bundle/Contents/Headers/VKTSession.h new file mode 100644 index 0000000..58e4842 --- /dev/null +++ b/Plugins/macOS/VideoKit.bundle/Contents/Headers/VKTSession.h @@ -0,0 +1,92 @@ +// +// VKTSession.h +// VideoKit +// +// Created by Yusuf Olokoba on 5/15/2023. +// Copyright © 2023 NatML Inc. All rights reserved. +// + +#pragma once + +#include + +#pragma region --Enumerations-- +/*! + @enum VKTStatus + + @abstract VideoKit status codes. + + @constant VKT_OK + Successful operation. + + @constant VKT_ERROR_INVALID_ARGUMENT + Provided argument is invalid. + + @constant VKT_ERROR_INVALID_OPERATION + Operation is invalid in current state. + + @constant VKT_ERROR_NOT_IMPLEMENTED + Operation has not been implemented. + + @constant VKT_ERROR_INVALID_SESSION + VideoKit session has not been set or is invalid. + + @constant VKT_ERROR_INVALID_PLAN + Current VideoKit plan does not allow the operation. + + @constant VKT_WARNING_LIMITED_PLAN + Current VideoKit plan only allows for limited functionality. 
+ */ +enum VKTStatus { + VKT_OK = 0, + VKT_ERROR_INVALID_ARGUMENT = 1, + VKT_ERROR_INVALID_OPERATION = 2, + VKT_ERROR_NOT_IMPLEMENTED = 3, + VKT_ERROR_INVALID_SESSION = 101, + VKT_ERROR_INVALID_PLAN = 104, + VKT_WARNING_LIMITED_PLAN = 105, +}; +typedef enum VKTStatus VKTStatus; +#pragma endregion + + +#pragma region --Client API-- +/*! + @function VKTGetBundleIdentifier + + @abstract Get the application bundle ID for generating a session token. + + @discussion Get the application bundle ID for generating a session token. + + @param bundle + Destination bundle ID string. + + @returns Operation status. +*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTGetBundleIdentifier (char * bundle); + +/*! + @function VKTGetSessionStatus + + @abstract Get the VideoKit session status. + + @discussion Get the VideoKit session status. + + @returns Session status. +*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTGetSessionStatus (void); + +/*! + @function VKTSetSessionToken + + @abstract Set the VideoKit session token. + + @discussion Set the VideoKit session token. + + @param token + VideoKit session token. + + @returns Session status. +*/ +VKT_BRIDGE VKT_EXPORT VKTStatus VKT_API VKTSetSessionToken (const char* token); +#pragma endregion diff --git a/Plugins/macOS/VideoKit.bundle/Contents/Headers/VKTTypes.h b/Plugins/macOS/VideoKit.bundle/Contents/Headers/VKTTypes.h new file mode 100644 index 0000000..1055b38 --- /dev/null +++ b/Plugins/macOS/VideoKit.bundle/Contents/Headers/VKTTypes.h @@ -0,0 +1,29 @@ +// +// VKTTypes.h +// VideoKit +// +// Created by Yusuf Olokoba on 5/15/2023. +// Copyright © 2023 NatML Inc. All rights reserved. +// + +#pragma once + +#ifdef __cplusplus + #define VKT_BRIDGE extern "C" +#else + #define VKT_BRIDGE extern +#endif + +#ifdef _WIN64 + #define VKT_EXPORT __declspec(dllexport) +#else + #define VKT_EXPORT +#endif + +#ifdef __EMSCRIPTEN__ + #define VKT_API EMSCRIPTEN_KEEPALIVE +#elif defined(_WIN64) + #define VKT_API APIENTRY +#else + #define VKT_API +#endif \ No newline at end of file diff --git a/Plugins/macOS/VideoKit.bundle/Contents/Headers/VideoKit.h b/Plugins/macOS/VideoKit.bundle/Contents/Headers/VideoKit.h new file mode 100644 index 0000000..87a72d3 --- /dev/null +++ b/Plugins/macOS/VideoKit.bundle/Contents/Headers/VideoKit.h @@ -0,0 +1,17 @@ +// +// VideoKit.h +// VideoKit +// +// Created by Yusuf Olokoba on 5/15/2023. +// Copyright © 2023 NatML Inc. All rights reserved. 
+// + +#pragma once + +#include +#include +#include +#include +#include +#include +#include \ No newline at end of file diff --git a/Plugins/macOS/VideoKit.bundle/Contents/Info.plist b/Plugins/macOS/VideoKit.bundle/Contents/Info.plist new file mode 100644 index 0000000..daaa50a --- /dev/null +++ b/Plugins/macOS/VideoKit.bundle/Contents/Info.plist @@ -0,0 +1,46 @@ + + + + + BuildMachineOSBuild + 23A344 + CFBundleDevelopmentRegion + en + CFBundleExecutable + VideoKit + CFBundleIdentifier + ai.natml.videokit + CFBundleInfoDictionaryVersion + 6.0 + CFBundleName + VideoKit + CFBundlePackageType + BNDL + CFBundleShortVersionString + 1.0 + CFBundleSupportedPlatforms + + MacOSX + + CFBundleVersion + 1 + DTCompiler + com.apple.compilers.llvm.clang.1_0 + DTPlatformBuild + + DTPlatformName + macosx + DTPlatformVersion + 14.0 + DTSDKBuild + 23A334 + DTSDKName + macosx14.0 + DTXcode + 1501 + DTXcodeBuild + 15A507 + LSMinimumSystemVersion + 10.15 + + diff --git a/Plugins/macOS/VideoKit.bundle/Contents/MacOS/VideoKit b/Plugins/macOS/VideoKit.bundle/Contents/MacOS/VideoKit new file mode 100755 index 0000000..2ba6cc4 Binary files /dev/null and b/Plugins/macOS/VideoKit.bundle/Contents/MacOS/VideoKit differ diff --git a/Plugins/macOS/VideoKit.bundle/Contents/PrivateHeaders/VKTAudioDevice.h b/Plugins/macOS/VideoKit.bundle/Contents/PrivateHeaders/VKTAudioDevice.h new file mode 100644 index 0000000..fc0601d --- /dev/null +++ b/Plugins/macOS/VideoKit.bundle/Contents/PrivateHeaders/VKTAudioDevice.h @@ -0,0 +1,18 @@ +// +// VKTAudioDevice.h +// VideoKit +// +// Created by Yusuf Olokoba on 6/14/2023. +// Copyright © 2023 NatML Inc. All rights reserved. +// + +#import "VKTMediaDevice.h" + +@interface VKTAudioDevice : NSObject +// Introspection +- (nonnull instancetype) initWithDevice:(nonnull AVCaptureDevice*) device; +@property (readonly, nonnull) AVCaptureDevice* device; +// Settings +@property int sampleRate; +@property int channelCount; +@end diff --git a/Plugins/macOS/VideoKit.bundle/Contents/PrivateHeaders/VKTAudioDeviceBuffer.h b/Plugins/macOS/VideoKit.bundle/Contents/PrivateHeaders/VKTAudioDeviceBuffer.h new file mode 100644 index 0000000..2bd25a3 --- /dev/null +++ b/Plugins/macOS/VideoKit.bundle/Contents/PrivateHeaders/VKTAudioDeviceBuffer.h @@ -0,0 +1,15 @@ +// +// VKTAudioDeviceBuffer.h +// VideoKit +// +// Created by Yusuf Olokoba on 6/14/2023. +// Copyright © 2023 NatML Inc. All rights reserved. +// + +@import AVFoundation; + +@interface VKTAudioDeviceBuffer : NSObject +@property (readonly, nonnull) CMSampleBufferRef sampleBuffer; +- (nonnull instancetype) initWithSampleBuffer:(nonnull CMSampleBufferRef) sampleBuffer; +@end + diff --git a/Plugins/macOS/VideoKit.bundle/Contents/PrivateHeaders/VKTCameraDevice.h b/Plugins/macOS/VideoKit.bundle/Contents/PrivateHeaders/VKTCameraDevice.h new file mode 100644 index 0000000..01ec674 --- /dev/null +++ b/Plugins/macOS/VideoKit.bundle/Contents/PrivateHeaders/VKTCameraDevice.h @@ -0,0 +1,19 @@ +// +// VKTCameraDevice.h +// VideoKit +// +// Created by Yusuf Olokoba on 6/14/2023. +// Copyright © 2023 NatML Inc. All rights reserved. 
+// + +#import "VKTMediaDevice.h" + +@interface VKTCameraDevice : NSObject +// Introspection +- (nonnull instancetype) initWithDevice:(nonnull AVCaptureDevice*) device; +@property (readonly, nonnull) AVCaptureDevice* device; +// Settings +@property CGSize previewResolution; +@property int frameRate; +- (void) capturePhoto:(nonnull VKTSampleBufferBlock) photoBufferBlock; +@end diff --git a/Plugins/macOS/VideoKit.bundle/Contents/PrivateHeaders/VKTCameraDeviceImage.h b/Plugins/macOS/VideoKit.bundle/Contents/PrivateHeaders/VKTCameraDeviceImage.h new file mode 100644 index 0000000..9fe4e2a --- /dev/null +++ b/Plugins/macOS/VideoKit.bundle/Contents/PrivateHeaders/VKTCameraDeviceImage.h @@ -0,0 +1,20 @@ +// +// VKTCameraDeviceImage.h +// VideoKit +// +// Created by Yusuf Olokoba on 6/14/2023. +// Copyright © 2023 NatML Inc. All rights reserved. +// + +@import AVFoundation; + +@interface VKTCameraDeviceImage : NSObject +@property (readonly, nonnull) CVPixelBufferRef pixelBuffer; +@property (readonly) UInt64 timestamp; +@property (readonly) bool verticallyMirrored; +@property (readonly) bool hasIntrinsicMatrix; +@property (readonly) matrix_float3x3 intrinsicMatrix; +@property (readonly, nonnull) NSDictionary* metadata; +- (nonnull instancetype) initWithSampleBuffer:(nonnull CMSampleBufferRef) sampleBuffer andMirror:(bool) mirror; +- (nonnull instancetype) initWithPhoto:(nonnull AVCapturePhoto*) photo andMirror:(bool) mirror; +@end diff --git a/Plugins/macOS/VideoKit.bundle/Contents/PrivateHeaders/VKTMediaDevice.h b/Plugins/macOS/VideoKit.bundle/Contents/PrivateHeaders/VKTMediaDevice.h new file mode 100644 index 0000000..3ae06a7 --- /dev/null +++ b/Plugins/macOS/VideoKit.bundle/Contents/PrivateHeaders/VKTMediaDevice.h @@ -0,0 +1,26 @@ +// +// VKTMediaDevice.h +// VideoKit +// +// Created by Yusuf Olokoba on 6/14/2023. +// Copyright © 2023 NatML Inc. All rights reserved. 
+// + +@import AVFoundation; +#include + +@protocol VKTMediaDevice; + +typedef void (^VKTSampleBufferBlock) (id _Nonnull sampleBuffer); +typedef void (^VKTDeviceDisconnectBlock) (id _Nonnull device); + +@protocol VKTMediaDevice +@required +@property (readonly, nonnull) NSString* uniqueID; +@property (readonly, nonnull) NSString* name; +@property (readonly) VKTDeviceFlags flags; +@property (readonly) bool running; +@property (nullable) VKTDeviceDisconnectBlock disconnectHandler; +- (void) startRunning:(nonnull VKTSampleBufferBlock) sampleBufferHandler; +- (void) stopRunning; +@end diff --git a/Plugins/macOS/VideoKit.bundle/Contents/_CodeSignature/CodeResources b/Plugins/macOS/VideoKit.bundle/Contents/_CodeSignature/CodeResources new file mode 100644 index 0000000..176dc46 --- /dev/null +++ b/Plugins/macOS/VideoKit.bundle/Contents/_CodeSignature/CodeResources @@ -0,0 +1,200 @@ + + + + + files + + files2 + + Headers/VKTCamera.h + + hash2 + + QDNlYhzY71qVfsgrVMc590iIALxzagieexXqeX1cwVE= + + + Headers/VKTDevice.h + + hash2 + + HEctu1Mkmk2c9j+s4aTht6tKygw02IZzeYe5LDmbAow= + + + Headers/VKTMicrophone.h + + hash2 + + GYR0cDjWVS5wbGnL6biA+yjgRyTY0O2HCwtSrPtv1Xs= + + + Headers/VKTRecorder.h + + hash2 + + RNY4XIfjmnbqVOf6Lxf4v2qa0t7eU6gCkcb/LVIMxO8= + + + Headers/VKTSession.h + + hash2 + + 9cQl1w3DyMIHyOqUFWJHjIX4WWuZjcYAL154T0l7mX0= + + + Headers/VKTTypes.h + + hash2 + + 3sDMMFDxmdYPLE5q3sBufoBDywKCm28DliwJ4ej0Q28= + + + Headers/VideoKit.h + + hash2 + + xZVtuzZG7rVNGoAqwRQkMMTU3jNu39NvRU7IWHflV/Q= + + + PrivateHeaders/VKTAudioDevice.h + + hash2 + + ZiTtr5jnajsASNpcE3NxwSZQz8cBgei2kiqQkI6qA5c= + + + PrivateHeaders/VKTAudioDeviceBuffer.h + + hash2 + + GgVnO59D33m+rJmcXWekOZetHubP6AXWDdb8sv4XBRw= + + + PrivateHeaders/VKTCameraDevice.h + + hash2 + + p/sWrTLCfOTqLAeD0o2GtWLr8DhrR85eSJCVeT/878M= + + + PrivateHeaders/VKTCameraDeviceImage.h + + hash2 + + QujiIG3I+wAgxwwcbtEAcyk1ao/mZ3XkxRfRVpMhj2w= + + + PrivateHeaders/VKTMediaDevice.h + + hash2 + + 7GqwApL8+iV8cp/1tCrFiwJx2J02pGIrgf3cfGxNBCI= + + + + rules + + ^Resources/ + + ^Resources/.*\.lproj/ + + optional + + weight + 1000 + + ^Resources/.*\.lproj/locversion.plist$ + + omit + + weight + 1100 + + ^Resources/Base\.lproj/ + + weight + 1010 + + ^version.plist$ + + + rules2 + + .*\.dSYM($|/) + + weight + 11 + + ^(.*/)?\.DS_Store$ + + omit + + weight + 2000 + + ^(Frameworks|SharedFrameworks|PlugIns|Plug-ins|XPCServices|Helpers|MacOS|Library/(Automator|Spotlight|LoginItems))/ + + nested + + weight + 10 + + ^.* + + ^Info\.plist$ + + omit + + weight + 20 + + ^PkgInfo$ + + omit + + weight + 20 + + ^Resources/ + + weight + 20 + + ^Resources/.*\.lproj/ + + optional + + weight + 1000 + + ^Resources/.*\.lproj/locversion.plist$ + + omit + + weight + 1100 + + ^Resources/Base\.lproj/ + + weight + 1010 + + ^[^/]+$ + + nested + + weight + 10 + + ^embedded\.provisionprofile$ + + weight + 20 + + ^version\.plist$ + + weight + 20 + + + + diff --git a/README.md b/README.md new file mode 100644 index 0000000..ed8aa7f --- /dev/null +++ b/README.md @@ -0,0 +1,152 @@ +# VideoKit +VideoKit is the only full feature user-generated content solution for Unity Engine. VideoKit allows: + +- **Video Recording**. Record MP4 videos, animated GIF images, audio files, and more in as little as zero lines of code. Simply drop a component in your scene, and setup buttons to start and stop recording. + +- **Interactive Video Effects**. Build TikTok and Snapchat-style video effects which leverage hardware machine learning, including color grading, human segmentation, face filters, and much more. 
+ +- **Seamless Video Editing**. Create video editing user flows with support for slicing videos, combining videos, extracting thumbnails, and more. + +- **Extensive Camera Control**. Build full-featured camera apps with focus controls, flash, manual exposure, white balance, and more. + +- **Audio Captioning**. Add audio-based user experiences in your application with audio captioning (speech-to-text). + +- **Text Commands**. Convert any natural language prompt to a `struct` for building text and voice command functionality. + +- **Social Sharing**. Power your app's virality engine by enabling your users to share their user-generated content in your app. + +- **Gallery Picking**. Pick images and videos from the camera roll to build highly personalized content flows. + +## Installing VideoKit +Add the following items to your Unity project's `Packages/manifest.json`: +```json +{ + "scopedRegistries": [ + { + "name": "NatML", + "url": "https://registry.npmjs.com", + "scopes": ["ai.natml", "ai.fxn"] + } + ], + "dependencies": { + "ai.natml.videokit": "0.0.17" + } +} +``` + +> VideoKit is still in alpha. As such, behaviours may change significantly between releases. + +## Retrieving your Access Key +To use VideoKit, you will need to generate an access key. First, head over to [hub.natml.ai](https://hub.natml.ai/account/developers) and log in to create an account. Once you have an account, generate an access key: + +![generating an access key](https://raw.githubusercontent.com/natmlx/videokit/main/Media/create-access-key.gif) + +Then add the key to your Unity project in `Project Settings > VideoKit`: + +![set the access key](https://raw.githubusercontent.com/natmlx/videokit/main/Media/set-access-key.gif) + +## Using VideoKit +Here are a few things you can do with VideoKit: + +### Social Sharing +Share images, audio, and video files with the native share sheet using the `MediaAsset.Share` method: +```csharp +Texture2D image = ...; +ImageAsset asset = await MediaAsset.FromTexture(image); +string receiverAppId = await asset.Share(); +``` + +### Saving to the Camera Roll +Save images and videos to the camera roll with the `MediaAsset.SaveToCameraRoll` method: +```csharp +Texture2D image = ...; +ImageAsset asset = await MediaAsset.FromTexture(image); +bool saved = await asset.SaveToCameraRoll(); +``` + +### Picking from the Camera Roll +Pick images and videos from the camera roll with the `MediaAsset.FromCameraRoll` method: +```csharp +// This will present the native gallery UI +var asset = await MediaAsset.FromCameraRoll() as ImageAsset; +Texture2D image = await asset.ToTexture(); +// Do stuff with `image`... +``` + +### Camera Streaming +Stream the camera preview with the `VideoKitCameraManager` component: + +![stream the camera preview](https://raw.githubusercontent.com/natmlx/videokit/main/Media/camera-streaming.gif) + +### Record Videos +Record MP4, HEVC, WEBM videos; animated GIF images; JPEG image sequences; and WAV audio files with the `VideoKitRecorder` component: + +![recording a video](https://raw.githubusercontent.com/natmlx/videokit/main/Media/video-recording.gif) + +> Video recording requires an active [VideoKit](https://hub.natml.ai/account/billing) subscription. Without one, you can still record `MP4` videos, but they will be downsized and watermarked.
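+
+For script-driven control, you can also start and stop recording from code instead of wiring it to UI components. The snippet below is a minimal sketch only: the `VideoKit` namespace and the `StartRecording`/`StopRecording` member names on `VideoKitRecorder` are assumptions for illustration, not a verbatim API listing, so check the VideoKit documentation for the exact signatures.
+```csharp
+using UnityEngine;
+using VideoKit; // assumed namespace for the VideoKitRecorder component
+
+public class RecordingToggle : MonoBehaviour {
+
+    // Assign a VideoKitRecorder from your scene in the Inspector.
+    [SerializeField] private VideoKitRecorder recorder;
+
+    private bool recording;
+
+    // Wire this to a UI button to toggle recording on and off.
+    public void ToggleRecording () {
+        if (!recording)
+            recorder.StartRecording();  // assumed method name
+        else
+            recorder.StopRecording();   // assumed method name
+        recording = !recording;
+    }
+}
+```
+
+Once a recording finishes, the resulting media can be shared or saved with the `MediaAsset` methods shown above, depending on how the recorder is configured.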
+ +### Human Texture +Remove the background from the camera preview with the `VideoKitCameraManager` component: + +![using the human texture](https://raw.githubusercontent.com/natmlx/videokit/main/Media/human-texture.gif) + +> Human texture requires an active [VideoKit AI](https://hub.natml.ai/account/billing) subscription. + +### Speech-to-Text +Caption audio with the `AudioAsset.Caption` method: +```csharp +AudioClip clip = ...; +var asset = await MediaAsset.FromAudioClip(clip); +var caption = await asset.Caption(); +Debug.Log(caption); +``` + +> Audio captioning requires an active [VideoKit AI](https://hub.natml.ai/account/billing) subscription. + +### Text Commands +Convert a natural language prompt into a `struct` with the `TextAsset.To` method. This enables features like text commands, and can be combined with audio captioning for voice control: +```csharp +using System.ComponentModel; // for `DescriptionAttribute` +using VideoKit.Assets; + +struct Command { // Define this however you want + + [Description(@"The user's name")] + public string name; + + [Description(@"The user's age")] + public int age; +} + +async void ParseCommand () { + var prompt = "My name is Jake and I'm thirteen years old."; + var asset = await MediaAsset.FromText(prompt); + var command = await asset.To(); + // command = { "name": "Jake", "age": 13 } +} +``` + +> Text commands can be used without a subscription, but will require an active [VideoKit AI](https://hub.natml.ai/account/billing) subscription in a later VideoKit release. + +___ + +## Requirements +- Unity 2022.3+ + +## Supported Platforms +- Android API Level 24+ +- iOS 13+ +- macOS 10.15+ (Apple Silicon and Intel) +- Windows 10+ (64-bit only) +- WebGL: + - Chrome 91+ + - Firefox 90+ + - Safari 16.4+ + +## Resources +- Join the [NatML community on Discord](https://natml.ai/community). +- See the [VideoKit documentation](https://docs.natml.ai/videokit). +- Check out [NatML on GitHub](https://github.com/natmlx). +- Contact us at [hi@natml.ai](mailto:hi@natml.ai). + +Thank you very much! \ No newline at end of file diff --git a/README.md.meta b/README.md.meta new file mode 100644 index 0000000..dee706a --- /dev/null +++ b/README.md.meta @@ -0,0 +1,7 @@ +fileFormatVersion: 2 +guid: 491c4a13c4bac46449760e43196d30a1 +TextScriptImporter: + externalObjects: {} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Resources.meta b/Resources.meta new file mode 100644 index 0000000..9186414 --- /dev/null +++ b/Resources.meta @@ -0,0 +1,8 @@ +fileFormatVersion: 2 +guid: afaff3737d93646ebb9cc6d59f7c9d39 +folderAsset: yes +DefaultImporter: + externalObjects: {} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Resources/RenderTextureOutput.compute b/Resources/RenderTextureOutput.compute new file mode 100644 index 0000000..103f7be --- /dev/null +++ b/Resources/RenderTextureOutput.compute @@ -0,0 +1,133 @@ +/* +* VideoKit +* Copyright (c) 2023 NatML Inc. All Rights Reserved. +*/ + +#pragma kernel ConvertYUV420 +#pragma kernel ConvertBGRA8888 +#pragma kernel ConvertRGBA8888 +#pragma kernel Rotate90 +#pragma kernel Rotate180 +#pragma kernel Rotate270 + +ByteAddressBuffer Input; +Texture2D Image; +int4 Offset; // [Y,Cb,Cr] +int4 Stride; // [Y_row, Cb_row, Y_pixel, Cb_pixel] +bool Mirror; +RWTexture2D Result; + +static const float3x3 YUV2RGB = float3x3( // https://www.mir.com/DMG/ycbcr.html + 1., 0., 1.402, + 1., -0.344136, -0.714136, + 1., 1.772, 0. 
+); + +int ReadByte (int index) { + int address = (index >> 2) << 2; + int offset = index - address; + int word = Input.Load(address); + int data = word >> (offset * 8); + return data & 0xFF; +} + +[numthreads(16, 16, 1)] +void ConvertYUV420 (uint3 id : SV_DispatchThreadID) { + // Check + uint width, height; + Result.GetDimensions(width, height); + if (id.x >= width) + return; + if (id.y >= height) + return; + // Color + uint j = Mirror ? height - id.y - 1 : id.y; + float y = ReadByte(Offset.x + j * Stride.x + id.x * Stride.z); + float u = ReadByte(Offset.y + j / 2 * Stride.y + id.x / 2 * Stride.w) - 128.; + float v = ReadByte(Offset.z + j / 2 * Stride.y + id.x / 2 * Stride.w) - 128.; + float3 yuv = float3(y, u, v) / 255.; + float3 rgb = mul(YUV2RGB, yuv); + Result[id.xy] = float4(rgb, 1.0); +} + +[numthreads(16, 16, 1)] +void ConvertRGBA8888 (uint3 id : SV_DispatchThreadID) { + // Check + uint width, height; + Result.GetDimensions(width, height); + if (id.x >= width) + return; + if (id.y >= height) + return; + // Color + uint j = Mirror ? height - id.y - 1 : id.y; + int pixel = Input.Load(j * Stride.x + id.x * 4); + float r = pixel & 0xFF; + float g = (pixel >> 8) & 0xFF; + float b = (pixel >> 16) & 0xFF; + float a = (pixel >> 24) & 0xFF; + float4 rgba = float4(r, g, b, a) / 255.; + Result[id.xy] = rgba; +} + +[numthreads(16, 16, 1)] +void ConvertBGRA8888 (uint3 id : SV_DispatchThreadID) { + // Check + uint width, height; + Result.GetDimensions(width, height); + if (id.x >= width) + return; + if (id.y >= height) + return; + // Color + uint j = Mirror ? height - id.y - 1 : id.y; + int pixel = Input.Load(j * Stride.x + id.x * 4); + float b = pixel & 0xFF; + float g = (pixel >> 8) & 0xFF; + float r = (pixel >> 16) & 0xFF; + float a = (pixel >> 24) & 0xFF; + float4 rgba = float4(r, g, b, a) / 255.; + Result[id.xy] = rgba; +} + +[numthreads(16, 16, 1)] +void Rotate90 (uint3 id : SV_DispatchThreadID) { + // Check + uint width, height; + Result.GetDimensions(width, height); + if (id.x >= width) + return; + if (id.y >= height) + return; + // Copy + uint2 uv = uint2(height - id.y - 1, id.x); + Result[id.xy] = Image[uv]; +} + +[numthreads(16, 16, 1)] +void Rotate180 (uint3 id : SV_DispatchThreadID) { + // Check + uint width, height; + Result.GetDimensions(width, height); + if (id.x >= width) + return; + if (id.y >= height) + return; + // Copy + uint2 uv = uint2(width - id.x - 1, height - id.y - 1); + Result[id.xy] = Image[uv]; +} + +[numthreads(16, 16, 1)] +void Rotate270 (uint3 id : SV_DispatchThreadID) { + // Check + uint width, height; + Result.GetDimensions(width, height); + if (id.x >= width) + return; + if (id.y >= height) + return; + // Copy + uint2 uv = uint2(id.y, width - id.x - 1); + Result[id.xy] = Image[uv]; +} diff --git a/Resources/RenderTextureOutput.compute.meta b/Resources/RenderTextureOutput.compute.meta new file mode 100644 index 0000000..16400e8 --- /dev/null +++ b/Resources/RenderTextureOutput.compute.meta @@ -0,0 +1,8 @@ +fileFormatVersion: 2 +guid: 401c9a92f0b194409872fa25ce43e53e +ComputeShaderImporter: + externalObjects: {} + preprocessorOverride: 0 + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime.meta b/Runtime.meta new file mode 100644 index 0000000..f47b49a --- /dev/null +++ b/Runtime.meta @@ -0,0 +1,8 @@ +fileFormatVersion: 2 +guid: 953d0e2791f2f4d1ab1ac502005e8dd9 +folderAsset: yes +DefaultImporter: + externalObjects: {} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/AI.meta b/Runtime/AI.meta new file mode 100644 
index 0000000..0797d37 --- /dev/null +++ b/Runtime/AI.meta @@ -0,0 +1,8 @@ +fileFormatVersion: 2 +guid: e87b192c264814673a8004b40f392575 +folderAsset: yes +DefaultImporter: + externalObjects: {} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/AI/MatteKitPredictor.cs b/Runtime/AI/MatteKitPredictor.cs new file mode 100644 index 0000000..f20abdd --- /dev/null +++ b/Runtime/AI/MatteKitPredictor.cs @@ -0,0 +1,131 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +namespace VideoKit.AI { + + using System; + using System.Threading.Tasks; + using UnityEngine; + using Unity.Collections.LowLevel.Unsafe; + using NatML; + using NatML.Features; + using NatML.Internal; + using NatML.Types; + using Internal; + + /// + /// MatteKit human texture predictor. + /// + public sealed class MatteKitPredictor : IMLPredictor { + + #region --Client API-- + /// + /// MatteKit graph variant. + /// + public enum Variant : int { + /// + /// Default variant. + /// This will choose a landscape or portrait graph depending on the screen orientation. + /// + Default = 0, + /// + /// Landscape HD graph. + /// + _1280x720 = 1, + /// + /// Portrait HD graph. + /// + _720x1280 = 2 + } + + /// + /// Human texture. + /// Access this after calling `Predict` with an image feature. + //// + public readonly Texture2D humanTexture; + + /// + /// Predict the human texture from a given image. + /// + /// Image feature. + /// Null. Access the `humanTexture` or `backgroundTexture` instead. + public unsafe object Predict (params MLFeature[] features) { + // Predict + var inputType = model.inputs[0] as MLImageType; + using var inputFeature = (features[0] as IMLEdgeFeature).Create(inputType); + using var outputFeatures = model.Predict(inputFeature); + // Resize + var humanFeature = new MLArrayFeature(outputFeatures[0]); + var width = humanFeature.shape[2]; + var height = humanFeature.shape[1]; + if (humanTexture.width != width || humanTexture.height != height) + humanTexture.Reinitialize(width, height); + // Copy + var elementsPerRow = width * 4; + var bytesPerRow = elementsPerRow * sizeof(float); + var dst = (float*)humanTexture.GetRawTextureData().GetUnsafePtr(); + fixed (float* src = humanFeature) + UnsafeUtility.MemCpyStride( + dst, + bytesPerRow, + src + (height - 1) * elementsPerRow, + -bytesPerRow, + bytesPerRow, + height + ); + humanTexture.Apply(); + // Return + return default; + } + + /// + /// Dispose the predictor and release resources. + /// + public void Dispose () { + model.Dispose(); + Texture2D.Destroy(humanTexture); + } + + /// + /// Create the MatteKit predictor. + /// NOTE: This requires an active VideoKit AI plan. + /// + /// MatteKit graph variant. + /// MatteKit model configuration. + /// MatteKit predictor. + public static async Task Create ( + Variant variant = Variant.Default, + MLEdgeModel.Configuration configuration = null + ) { + variant = variant == Variant.Default ? 
GetDefaultVariant() : variant; + var tag = GetTag(variant); + var model = await MLEdgeModel.Create(tag, configuration, VideoKitSettings.Instance.natml); + var predictor = new MatteKitPredictor(model); + return predictor; + } + #endregion + + + #region --Operations-- + private readonly MLEdgeModel model; + + private MatteKitPredictor (MLEdgeModel model) { + this.model = model; + this.humanTexture = new Texture2D(16, 16, TextureFormat.RGBAFloat, false); + } + + private static string GetTag (Variant variant) => variant switch { + Variant._1280x720 => "@videokit/mattekit@1280x720-ne", + Variant._720x1280 => "@videokit/mattekit@720x1280-ne", + _ => null, + }; + + private static Variant GetDefaultVariant () => + Array.IndexOf(new [] { RuntimePlatform.Android, RuntimePlatform.IPhonePlayer }, Application.platform) >= 0 && + Array.IndexOf(new [] { ScreenOrientation.Portrait, ScreenOrientation.PortraitUpsideDown }, Screen.orientation) >= 0 ? + Variant._720x1280 : Variant._1280x720; + #endregion + } +} \ No newline at end of file diff --git a/Runtime/AI/MatteKitPredictor.cs.meta b/Runtime/AI/MatteKitPredictor.cs.meta new file mode 100644 index 0000000..1912ec5 --- /dev/null +++ b/Runtime/AI/MatteKitPredictor.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: b9f193618c24744cca335318de20a2f8 +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {fileID: 2800000, guid: d6df589e0376343f7a3b406329630a74, type: 3} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Assets.meta b/Runtime/Assets.meta new file mode 100644 index 0000000..54e00a1 --- /dev/null +++ b/Runtime/Assets.meta @@ -0,0 +1,8 @@ +fileFormatVersion: 2 +guid: 50a8b6a722ab14441bcedcd88c9261f9 +folderAsset: yes +DefaultImporter: + externalObjects: {} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Assets/AudioAsset.cs b/Runtime/Assets/AudioAsset.cs new file mode 100644 index 0000000..66ae9eb --- /dev/null +++ b/Runtime/Assets/AudioAsset.cs @@ -0,0 +1,81 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +#nullable enable + +namespace VideoKit.Assets { + + using System; + using System.IO; + using System.Threading.Tasks; + using Function.Types; + using Internal; + + /// + /// Audio asset. + /// + public sealed class AudioAsset : MediaAsset { + + #region --Client API-- + /// + /// Audio sample rate. + /// + public readonly int sampleRate; + + /// + /// Audio channel count. + /// + public readonly int channelCount; + + /// + /// Audio duration in seconds. + /// + public readonly float duration; + + /// + /// Generate captions for the audio asset. + /// NOTE: This requires an active VideoKit AI plan. 
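+        /// Usage sketch (hypothetical, mirroring the README example):
+        ///   var asset = await MediaAsset.FromAudioClip(clip);
+        ///   var caption = await asset.Caption();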
+ /// + public async Task Caption () { + // Caption + using var stream = OpenReadStream(); + var prediction = await VideoKitSettings.Instance.fxn.Predictions.Create( + "@videokit/caption-v0-1", + inputs: new () { [@"audio"] = stream } + ) as CloudPrediction; + // Check + if (!string.IsNullOrEmpty(prediction.error)) + throw new InvalidOperationException(prediction.error); + // Return + var result = prediction.results[0] as string; + return result; + } + + /// + public override async Task ToValue (int minUploadSize = 4096) { // DEPLOY + using var stream = OpenReadStream(); + var name = Path.GetFileName(path); + var value = await VideoKitSettings.Instance.fxn.Predictions.ToValue( + stream, + name, + Dtype.Audio, + minUploadSize: minUploadSize + ); + return value; + } + #endregion + + + #region --Operations-- + + internal AudioAsset (string path, int sampleRate, int channelCount, float duration) { + this.path = path; + this.sampleRate = sampleRate; + this.channelCount = channelCount; + this.duration = duration; + } + #endregion + } +} \ No newline at end of file diff --git a/Runtime/Assets/AudioAsset.cs.meta b/Runtime/Assets/AudioAsset.cs.meta new file mode 100644 index 0000000..fd773be --- /dev/null +++ b/Runtime/Assets/AudioAsset.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: 3184d2d875f1a45aa9caa2e72289a460 +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {instanceID: 0} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Assets/ImageAsset.cs b/Runtime/Assets/ImageAsset.cs new file mode 100644 index 0000000..3ab191b --- /dev/null +++ b/Runtime/Assets/ImageAsset.cs @@ -0,0 +1,76 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +#nullable enable + +namespace VideoKit.Assets { + + using System; + using System.IO; + using System.Text; + using System.Threading.Tasks; + using UnityEngine; + using UnityEngine.Networking; + using Function.Types; + using Internal; + + /// + /// Image asset. + /// + public sealed class ImageAsset : MediaAsset { + + #region --Client API-- + /// + /// Image width. + /// + public readonly int width; + + /// + /// Image height. + /// + public readonly int height; + + /// + /// Load the image asset into a texture. + /// + public async Task ToTexture () { + var uri = path[0] == '/' ? 
$"file://{path}" : path; + using var request = UnityWebRequestTexture.GetTexture(uri); + request.SendWebRequest(); + while (!request.isDone) + await Task.Yield(); + if (request.result != UnityWebRequest.Result.Success) { + Debug.LogError($"VideoKit: Image asset failed to create texture with error: {request.error}"); + return null; + } + var texture = DownloadHandlerTexture.GetContent(request); + return texture; + } + + /// + public override async Task ToValue (int minUploadSize = 4096) { // DEPLOY + using var stream = OpenReadStream(); + var name = Path.GetFileName(path); + var value = await VideoKitSettings.Instance.fxn.Predictions.ToValue( + stream, + name, + Dtype.Image, + minUploadSize: minUploadSize + ); + return value; + } + #endregion + + + #region --Operations-- + + internal ImageAsset (string path, int width, int height) { + this.path = path; + this.width = width; + this.height = height; + } + #endregion + } +} \ No newline at end of file diff --git a/Runtime/Assets/ImageAsset.cs.meta b/Runtime/Assets/ImageAsset.cs.meta new file mode 100644 index 0000000..e2665b0 --- /dev/null +++ b/Runtime/Assets/ImageAsset.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: 98d1ca3adbd444cda81d3400c04112fe +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {instanceID: 0} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Assets/MediaAsset.cs b/Runtime/Assets/MediaAsset.cs new file mode 100644 index 0000000..fb7a8b7 --- /dev/null +++ b/Runtime/Assets/MediaAsset.cs @@ -0,0 +1,282 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +#nullable enable + +namespace VideoKit.Assets { + + using AOT; + using System; + using System.IO; + using System.Linq; + using System.Runtime.InteropServices; + using System.Text; + using System.Text.RegularExpressions; + using System.Threading.Tasks; + using UnityEngine; + using UnityEngine.Android; + using Function.Types; + using Internal; + using Recorders; + using static Internal.VideoKit; + + /// + /// Media asset. + /// + public abstract class MediaAsset { + + #region --Client API-- + /// + /// Path to media asset. + /// + public string path { get; protected set; } + + /// + /// Convert the media asset to a Function value. + /// + /// Media assets larger than this size in bytes will be uploaded. + /// Function value for making predictions. + public abstract Task ToValue (int minUploadSize = 4096); + + /// + /// Share the media asset using the native sharing UI. + /// + /// Optional message to share with the media asset. + /// Receiving app bundle ID or `null` if the user did not complete the share action. + public virtual Task Share (string? message = null) { + var tcs = new TaskCompletionSource(); + var handle = GCHandle.Alloc(tcs, GCHandleType.Normal); + try { + VideoKit.ShareAsset(path, message, OnShare, (IntPtr)handle).CheckStatus(); + } catch (Exception ex) { + handle.Free(); + tcs.SetException(ex); + } + return tcs.Task; + } + + /// + /// Save the media asset to the camera roll. + /// + /// Optional album to save media asset to. + /// Whether the asset was successfully saved to the camera roll. + public virtual Task SaveToCameraRoll (string? 
album = null) { + var tcs = new TaskCompletionSource(); + var handle = GCHandle.Alloc(tcs, GCHandleType.Normal); + try { + VideoKit.SaveAssetToCameraRoll(path, album, OnSaveToCameraRoll, (IntPtr)handle).CheckStatus(); + } catch (Exception ex) { + handle.Free(); + tcs.SetException(ex); + } + return tcs.Task; + } + + /// + /// Delete the media asset on disk. + /// + public virtual void Delete () { // INCOMPLETE // Handle WebGL + if (Application.platform != RuntimePlatform.WebGLPlayer) + File.Delete(path); + } + #endregion + + + #region --Creators-- + /// + /// Create a media aseet from a file. + /// + /// Path to media file. + /// Media asset. + public static Task FromFile (string path) { + // Check sequence + var paths = Path.PathSeparator == ':' ? + Regex.Split(path, @"(? 1) + return FromSequence(paths); + // Load + var tcs = new TaskCompletionSource(); + var handle = GCHandle.Alloc(tcs, GCHandleType.Normal); + try { + VideoKit.LoadAsset(path, OnLoad, (IntPtr)handle).CheckStatus(); + } catch (Exception ex) { + handle.Free(); + tcs.SetException(ex); + } + // Return + return tcs.Task; + } + + /// + /// Create a media aseet from a texture. + /// + /// Texture. + /// Image asset. + public static Task FromTexture (Texture2D texture) { + // Check + if (!texture.isReadable) + return Task.FromException(new ArgumentException(@"Cannot create media asset from texture that is not readable")); + // Write to file + string path = null; + var encoded = texture.EncodeToPNG(); + if (Application.platform == RuntimePlatform.WebGLPlayer) { + var sb = new StringBuilder(2048); + VideoKitExt.WriteImage(encoded, encoded.Length, sb, sb.Capacity).CheckStatus(); + path = sb.ToString(); + } else { + var name = Guid.NewGuid().ToString("N"); + path = Path.Combine(Application.temporaryCachePath, $"{name}.png"); + File.WriteAllBytes(path, encoded); + } + // Create asset + var asset = new ImageAsset(path, texture.width, texture.height); + // Return + return Task.FromResult(asset); + } + + /// + /// Create a media aseet from plain text. + /// + /// Text. + /// Text asset. + public static Task FromText (string text) => Task.FromResult(new TextAsset(text)); + + /// + /// Create a media aseet from an audio clip. + /// + /// Audio clip. + /// Audio asset. + public static async Task FromAudioClip (AudioClip clip) { + var sampleBuffer = new float[clip.samples * clip.channels]; + var recorder = await MediaRecorder.Create(MediaFormat.WAV, sampleRate: clip.frequency, channelCount: clip.channels); + clip.GetData(sampleBuffer, 0); + recorder.CommitSamples(sampleBuffer, 0L); + var path = await recorder.FinishWriting(); + var asset = new AudioAsset(path, clip.frequency, clip.channels, clip.length); + return asset; + } + + /// + /// Create a media asset by prompting the user to select an image or video from the camera roll. + /// NOTE: This requires iOS 14+. + /// + /// Media asset. 
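+        /// Usage sketch (hypothetical; assumes the generic type parameter selects the asset kind):
+        ///   var asset = await MediaAsset.FromCameraRoll<ImageAsset>();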
+ public static Task FromCameraRoll () where T : MediaAsset { + var type = GetAssetType(); + var tcs = new TaskCompletionSource(); + var handle = GCHandle.Alloc(tcs, GCHandleType.Normal); + try { + VideoKit.LoadAssetFromCameraRoll(type, OnLoad, (IntPtr)handle).CheckStatus(); + } catch (Exception ex) { + handle.Free(); + tcs.SetException(ex); + } + return tcs.Task; + } + #endregion + + + #region --Operations-- + + protected Stream OpenReadStream () { // INCOMPLETE // WebGL support + if (Application.platform == RuntimePlatform.WebGLPlayer) + return null; + return new FileStream(path, FileMode.Open, FileAccess.Read); + } + + private static async Task FromSequence (string[] paths) { + var path = string.Join(Path.PathSeparator, paths); + var assets = await Task.WhenAll(paths.Select(FromFile)); + var sequence = new MediaSequenceAsset(path, assets); + return sequence; + } + + [MonoPInvokeCallback(typeof(VideoKit.AssetLoadHandler))] + private static void OnLoad ( + IntPtr context, + IntPtr rawPath, + AssetType type, + int width, + int height, + float frameRate, + int sampleRate, + int channelCount, + float duration + ) { + // Check + if (!VideoKit.IsAppDomainLoaded) + return; + // Get tcs + TaskCompletionSource tcs; + try { + var handle = (GCHandle)context; + tcs = handle.Target as TaskCompletionSource; + handle.Free(); + } catch (Exception ex) { + Debug.LogException(ex); + return; + } + // Invoke + var path = Marshal.PtrToStringUTF8(rawPath); + if (type == AssetType.Image) + tcs?.SetResult(new ImageAsset(path, width, height)); + else if (type == AssetType.Audio) + tcs?.SetResult(new AudioAsset(path, sampleRate, channelCount, duration)); + else if (type == AssetType.Video) + tcs?.SetResult(new VideoAsset(path, width, height, frameRate, sampleRate, channelCount, duration)); + else + tcs?.SetResult(null); // any errors get logged natively + } + + [MonoPInvokeCallback(typeof(AssetShareHandler))] + private static void OnShare (IntPtr context, IntPtr receiver) { + // Check + if (!VideoKit.IsAppDomainLoaded) + return; + // Get tcs + TaskCompletionSource tcs; + try { + var handle = (GCHandle)context; + tcs = handle.Target as TaskCompletionSource; + handle.Free(); + } catch (Exception ex) { + Debug.LogException(ex); + return; + } + // Invoke + var result = Marshal.PtrToStringUTF8(receiver); + tcs?.SetResult(result); + } + + [MonoPInvokeCallback(typeof(AssetShareHandler))] + private static void OnSaveToCameraRoll (IntPtr context, IntPtr receiver) { + // Check + if (!VideoKit.IsAppDomainLoaded) + return; + // Get tcs + TaskCompletionSource tcs; + try { + var handle = (GCHandle)context; + tcs = handle.Target as TaskCompletionSource; + handle.Free(); + } catch (Exception ex) { + Debug.LogException(ex); + return; + } + // Invoke + var result = receiver != IntPtr.Zero; + tcs?.SetResult(result); + } + + private static AssetType GetAssetType() where T : MediaAsset => typeof(T) switch { + var t when t == typeof(ImageAsset) => VideoKit.AssetType.Image, + var t when t == typeof(AudioAsset) => VideoKit.AssetType.Audio, + var t when t == typeof(VideoAsset) => VideoKit.AssetType.Video, + _ => VideoKit.AssetType.Unknown, + }; + #endregion + } +} \ No newline at end of file diff --git a/Runtime/Assets/MediaAsset.cs.meta b/Runtime/Assets/MediaAsset.cs.meta new file mode 100644 index 0000000..a92bae4 --- /dev/null +++ b/Runtime/Assets/MediaAsset.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: 7cb0c934cf1d244caa5e5e02237200d2 +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + 
executionOrder: 0 + icon: {instanceID: 0} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Assets/MediaSequenceAsset.cs b/Runtime/Assets/MediaSequenceAsset.cs new file mode 100644 index 0000000..46fcd60 --- /dev/null +++ b/Runtime/Assets/MediaSequenceAsset.cs @@ -0,0 +1,65 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +#nullable enable + +namespace VideoKit.Assets { + + using System; + using System.IO; + using System.Text; + using System.Threading.Tasks; + using UnityEngine; + using UnityEngine.Networking; + using Function.Types; + using Internal; + + /// + /// Media sequence which contains a collection of media assets. + /// + public sealed class MediaSequenceAsset : MediaAsset { + + #region --Client API-- + /// + /// Media assets within the sequence. + /// + public readonly MediaAsset[] assets; + + /// + public override async Task ToValue (int minUploadSize = 4096) { + throw new InvalidOperationException(@"Image sequences cannot be converted to Function values"); + } + + /// + /// This is not supported and will raise an exception. + /// + public override async Task Share (string? message = null) { + throw new InvalidOperationException(@"Image sequences cannot be shared"); + } + + /// + /// This is not supported and will raise an exception. + /// + public override async Task SaveToCameraRoll (string? album = null) { + throw new InvalidOperationException(@"Image sequences cannot be saved to the camera roll"); + } + + /// + public override void Delete () { + foreach (var asset in assets) + asset.Delete(); + } + #endregion + + + #region --Operations-- + + internal MediaSequenceAsset (string path, MediaAsset[] assets) { + this.path = path; + this.assets = assets; + } + #endregion + } +} \ No newline at end of file diff --git a/Runtime/Assets/MediaSequenceAsset.cs.meta b/Runtime/Assets/MediaSequenceAsset.cs.meta new file mode 100644 index 0000000..a526c55 --- /dev/null +++ b/Runtime/Assets/MediaSequenceAsset.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: a748f5a9dc88d462dba45e4a497ba4e2 +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {instanceID: 0} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Assets/TextAsset.cs b/Runtime/Assets/TextAsset.cs new file mode 100644 index 0000000..593ee1e --- /dev/null +++ b/Runtime/Assets/TextAsset.cs @@ -0,0 +1,99 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +#nullable enable + +namespace VideoKit.Assets { + + using AOT; + using System; + using System.IO; + using System.Runtime.InteropServices; + using System.Text; + using System.Threading.Tasks; + using UnityEngine; + using Function.Types; + using Newtonsoft.Json; + using NJsonSchema; + using NJsonSchema.Generation; + using Internal; + + /// + /// Text asset. + /// + public sealed class TextAsset : MediaAsset { + + #region --Client API-- + /// + /// Text contents. + /// + public readonly string text; + + /// + /// Parse a structured data model from the text asset's contents. + /// NOTE: This requires an active VideoKit AI plan. 
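+        /// Usage sketch (hypothetical; `Command` is any user-defined struct, as in the README example):
+        ///   var asset = await MediaAsset.FromText("My name is Jake and I'm thirteen years old.");
+        ///   var command = await asset.To<Command>();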
+ /// + public async Task To () where T : struct { + // Generate schema + var settings = new JsonSchemaGeneratorSettings { + GenerateAbstractSchemas = false, + GenerateExamples = false, + UseXmlDocumentation = false, + ResolveExternalXmlDocumentation = false, + FlattenInheritanceHierarchy = false, + }; + var schema = JsonSchema.FromType(settings); + // Convert to structured + var prediction = await VideoKitSettings.Instance.fxn.Predictions.Create( + "@videokit/un2structured-v0-1", + inputs: new () { [@"text"] = text, [@"schema"] = schema.ToJson() } + ) as CloudPrediction; + // Check + if (!string.IsNullOrEmpty(prediction.error)) + throw new InvalidOperationException(prediction.error); + // Parse + var resultStr = prediction.results[0] as string; + var result = JsonConvert.DeserializeObject(resultStr); + // Return + return result; + } + + /// + public override async Task ToValue (int minUploadSize = 4096) { // DEPLOY + var buffer = Encoding.UTF8.GetBytes(text); + using var stream = new MemoryStream(buffer); + var value = await VideoKitSettings.Instance.fxn.Predictions.ToValue( + stream, + @"asset.txt", + Dtype.String, + minUploadSize: minUploadSize + ); + return value; + } + + /// + /// This is unsupported and will raise an exception. + /// + public override Task Share (string? message = null) { + throw new InvalidOperationException(@"Text asset cannot be shared"); + } + + /// + /// This is unsupported and will raise an exception. + /// + public override Task SaveToCameraRoll (string? album = null) { + throw new InvalidOperationException(@"Text asset cannot be saved to the camera roll"); + } + #endregion + + + #region --Operations-- + + internal TextAsset (string text) => this.text = text; + + public static implicit operator string (TextAsset asset) => asset.text; + #endregion + } +} \ No newline at end of file diff --git a/Runtime/Assets/TextAsset.cs.meta b/Runtime/Assets/TextAsset.cs.meta new file mode 100644 index 0000000..1c05dc4 --- /dev/null +++ b/Runtime/Assets/TextAsset.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: f1b7a39321a6342f7987a21011a56b74 +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {instanceID: 0} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Assets/VideoAsset.cs b/Runtime/Assets/VideoAsset.cs new file mode 100644 index 0000000..2a1d49c --- /dev/null +++ b/Runtime/Assets/VideoAsset.cs @@ -0,0 +1,121 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +namespace VideoKit.Assets { + + using System.IO; + using System.Threading.Tasks; + using UnityEngine; + using Function.Types; + using Internal; + + /// + /// Video asset. + /// + public sealed class VideoAsset : MediaAsset { + + #region --Client API-- + /// + /// Video width. + /// + public readonly int width; + + /// + /// Video height. + /// + public readonly int height; + + /// + /// Video frame rate. + /// + public readonly float frameRate; + + /// + /// Audio sample rate. + /// + public readonly int sampleRate; + + /// + /// Audio channel count. + /// + public readonly int channelCount; + + /// + /// Video duration in seconds. + /// + public readonly float duration; + + /// + /// Playback the video asset in the native media player if supported. 
+ /// + public void Playback () { + #if UNITY_EDITOR + //UnityEditor.EditorUtility.OpenWithDefaultApp(path); // errors on macOS for whatever reason + #elif UNITY_IOS || UNITY_ANDROID + Handheld.PlayFullScreenMovie($"file://{path}"); + #endif + } + + /// + /// Create a thumbnail image from the video. + /// + /// Approximate time to create the thumbnail from. + /// Thumbnail image. + internal Task CreateThumbnail (float time = 0f) { // INCOMPLETE + return default; + } + + /// + /// Trim the video based on the start and end time. + /// NOTE: This requires an active VideoKit plan. + /// + /// Start time in seconds. + /// Duration in seconds. If negative, trim to the end of the video. + /// Resulting trimmed video. + internal Task Trim (float start = 0f, float duration = -1f) { // INCOMPLETE + return default; + } + + /// + /// Reverse the video. + /// NOTE: This requires an active VideoKit plan. + /// + internal Task Reverse () { // INCOMPLETE + return default; + } + + /// + /// Extract the audio track from a video. + /// NOTE: This requires an active VideoKit plan. + /// + internal Task ExtractAudio () { // INCOMPLETE + return default; + } + + /// + public override async Task ToValue (int minUploadSize = 4096) { // DEPLOY + using var stream = OpenReadStream(); + var fxn = VideoKitSettings.Instance.fxn; + var name = Path.GetFileName(path); + var value = await fxn.Predictions.ToValue(stream, name, Dtype.Video, minUploadSize: minUploadSize); + return value; + } + #endregion + + + #region --Operations-- + + internal VideoAsset (string path, int width, int height, float frameRate, int sampleRate, int channelCount, float duration) { + this.path = path; + this.width = width; + this.height = height; + this.frameRate = frameRate; + this.sampleRate = sampleRate; + this.channelCount = channelCount; + this.duration = duration; + } + #endregion + } +} \ No newline at end of file diff --git a/Runtime/Assets/VideoAsset.cs.meta b/Runtime/Assets/VideoAsset.cs.meta new file mode 100644 index 0000000..51d6a69 --- /dev/null +++ b/Runtime/Assets/VideoAsset.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: a4bf12628ef5a4693bb4d37f020e7d2e +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {instanceID: 0} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Clocks.meta b/Runtime/Clocks.meta new file mode 100644 index 0000000..7ae2abb --- /dev/null +++ b/Runtime/Clocks.meta @@ -0,0 +1,8 @@ +fileFormatVersion: 2 +guid: 3d532d93d71a0424b91ba8fe9f16e712 +folderAsset: yes +DefaultImporter: + externalObjects: {} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Clocks/FixedIntervalClock.cs b/Runtime/Clocks/FixedIntervalClock.cs new file mode 100644 index 0000000..f51622f --- /dev/null +++ b/Runtime/Clocks/FixedIntervalClock.cs @@ -0,0 +1,57 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +#nullable enable + +namespace VideoKit.Recorders.Clocks { + + using System.Runtime.CompilerServices; + + /// + /// Clock that produces timestamps spaced at a fixed interval. + /// This clock is useful for enforcing a fixed framerate in a recording. + /// + public sealed class FixedIntervalClock : IClock { + + #region --Client API-- + /// + /// Interval between consecutive timestamps generated by the clock in seconds. + /// + public readonly double interval; + + /// + /// Current timestamp in nanoseconds. + /// The very first value reported by this property will always be zero. 
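+        /// For example (hypothetical values): with `new FixedIntervalClock(30)` and the default auto-tick,
+        /// successive reads of this property return 0, 33333333, 66666666, ... nanoseconds.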
+ /// + public long timestamp { + [MethodImpl(MethodImplOptions.Synchronized)] + get => (long)((autoTick ? ticks++ : ticks) * interval * 1e+9); + } + + /// + /// Create a fixed interval clock for a given framerate. + /// + /// Desired framerate for clock's timestamps. + /// Optional. If true, the clock will tick when its `Timestamp` is accessed. + public FixedIntervalClock (float framerate, bool autoTick = true) { + this.interval = 1.0 / framerate; + this.ticks = 0L; + this.autoTick = autoTick; + } + + /// + /// Advance the clock by its time interval. + /// + [MethodImpl(MethodImplOptions.Synchronized)] + public void Tick () => ticks++; + #endregion + + + #region --Operations-- + private readonly bool autoTick; + private long ticks; + #endregion + } +} \ No newline at end of file diff --git a/Runtime/Clocks/FixedIntervalClock.cs.meta b/Runtime/Clocks/FixedIntervalClock.cs.meta new file mode 100644 index 0000000..49ab153 --- /dev/null +++ b/Runtime/Clocks/FixedIntervalClock.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: b97737cbc7d354aa297f55f6a88dc865 +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {fileID: 2800000, guid: d6df589e0376343f7a3b406329630a74, type: 3} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Clocks/IClock.cs b/Runtime/Clocks/IClock.cs new file mode 100644 index 0000000..54ab631 --- /dev/null +++ b/Runtime/Clocks/IClock.cs @@ -0,0 +1,22 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +#nullable enable + +namespace VideoKit.Recorders.Clocks { + + /// + /// Clock for generating recording timestamps. + /// Clocks are important for synchronizing audio and video tracks when recording with audio. + /// Clocks are thread-safe, so they can be used on multiple threads simultaneously. + /// + public interface IClock { + + /// + /// Current timestamp in nanoseconds. + /// + long timestamp { get; } + } +} \ No newline at end of file diff --git a/Runtime/Clocks/IClock.cs.meta b/Runtime/Clocks/IClock.cs.meta new file mode 100644 index 0000000..646c93d --- /dev/null +++ b/Runtime/Clocks/IClock.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: 444cba47949ba4c74a297c71da719709 +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {fileID: 2800000, guid: d6df589e0376343f7a3b406329630a74, type: 3} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Clocks/RealtimeClock.cs b/Runtime/Clocks/RealtimeClock.cs new file mode 100644 index 0000000..77181ff --- /dev/null +++ b/Runtime/Clocks/RealtimeClock.cs @@ -0,0 +1,53 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +#nullable enable + +namespace VideoKit.Recorders.Clocks { + + using System; + using System.Runtime.CompilerServices; + using Stopwatch = System.Diagnostics.Stopwatch; + + /// + /// Realtime clock for generating timestamps + /// + public sealed class RealtimeClock : IClock { + + #region --Client API-- + /// + /// Current timestamp in nanoseconds. + /// The very first value reported by this property will always be zero. + /// + public long timestamp { + [MethodImpl(MethodImplOptions.Synchronized)] + get { + var time = stopwatch.Elapsed.Ticks * 100L; + if (!stopwatch.IsRunning) + stopwatch.Start(); + return time; + } + } + + /// + /// Is the clock paused? 
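+        /// Setting this to true stops the underlying stopwatch; setting it back to false resumes
+        /// timestamp generation (a hypothetical use: pausing timestamps while recording is suspended).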
+ /// + public bool paused { + [MethodImpl(MethodImplOptions.Synchronized)] get => !stopwatch.IsRunning; + [MethodImpl(MethodImplOptions.Synchronized)] set => (value ? (Action)stopwatch.Stop : stopwatch.Start)(); + } + + /// + /// Create a realtime clock. + /// + public RealtimeClock () => this.stopwatch = new Stopwatch(); + #endregion + + + #region --Operations-- + private readonly Stopwatch stopwatch; + #endregion + } +} \ No newline at end of file diff --git a/Runtime/Clocks/RealtimeClock.cs.meta b/Runtime/Clocks/RealtimeClock.cs.meta new file mode 100644 index 0000000..c82b331 --- /dev/null +++ b/Runtime/Clocks/RealtimeClock.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: db948aeac09b54ddc98518263345be10 +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {fileID: 2800000, guid: d6df589e0376343f7a3b406329630a74, type: 3} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Devices.meta b/Runtime/Devices.meta new file mode 100644 index 0000000..b1642ac --- /dev/null +++ b/Runtime/Devices.meta @@ -0,0 +1,8 @@ +fileFormatVersion: 2 +guid: 03a14781e85d44bbe94ba9acc0479b17 +folderAsset: yes +DefaultImporter: + externalObjects: {} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Devices/AudioBuffer.cs b/Runtime/Devices/AudioBuffer.cs new file mode 100644 index 0000000..ae97d22 --- /dev/null +++ b/Runtime/Devices/AudioBuffer.cs @@ -0,0 +1,101 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +namespace VideoKit.Devices { + + using System; + using Unity.Collections; + using Unity.Collections.LowLevel.Unsafe; + using Internal; + + /// + /// Audio buffer provided by an audio device. + /// The audio buffer always contains a linear PCM sample buffer interleaved by channel. + /// + public readonly struct AudioBuffer { + + #region --Client API-- + /// + /// Audio device that this buffer was generated from. + /// + public readonly AudioDevice device; + + /// + /// Audio sample buffer. + /// + public unsafe readonly NativeArray sampleBuffer; + + /// + /// Audio buffer sample rate. + /// + public readonly int sampleRate; + + /// + /// Audio buffer channel count. + /// + public readonly int channelCount; + + /// + /// Audio buffer timestamp in nanoseconds. + /// The timestamp is based on the system media clock. + /// + public readonly long timestamp; + + /// + /// Safely clone the audio buffer. 
+ /// The clone will never contain a valid sample buffer + /// + public AudioBuffer Clone () => new AudioBuffer( + device, + default, + sampleRate, + channelCount, + timestamp + ); + #endregion + + + #region --Operations-- + internal readonly IntPtr nativeBuffer; + + internal unsafe AudioBuffer ( + AudioDevice device, + IntPtr audioBuffer + ) { + this.device = device; + this.nativeBuffer = audioBuffer; + this.sampleBuffer = Wrap(audioBuffer.AudioBufferData(), audioBuffer.AudioBufferSampleCount()); + this.sampleRate = audioBuffer.AudioBufferSampleRate(); + this.channelCount = audioBuffer.AudioBufferChannelCount(); + this.timestamp = audioBuffer.AudioBufferTimestamp(); + } + + private AudioBuffer ( + AudioDevice device, + NativeArray sampleBuffer, + int sampleRate, + int channelCount, + long timestamp + ) { + this.device = device; + this.sampleBuffer = sampleBuffer; + this.sampleRate = sampleRate; + this.channelCount = channelCount; + this.timestamp = timestamp; + this.nativeBuffer = default; + } + + private static unsafe NativeArray Wrap (float* buffer, int size) { + var nativeArray = NativeArrayUnsafeUtility.ConvertExistingDataToNativeArray(buffer, size, Allocator.None); + #if ENABLE_UNITY_COLLECTIONS_CHECKS + NativeArrayUnsafeUtility.SetAtomicSafetyHandle(ref nativeArray, AtomicSafetyHandle.Create()); + #endif + return nativeArray; + } + + private static NativeArray Wrap (float[] buffer) => new NativeArray(buffer, Allocator.Temp); + #endregion + } +} \ No newline at end of file diff --git a/Runtime/Devices/AudioBuffer.cs.meta b/Runtime/Devices/AudioBuffer.cs.meta new file mode 100644 index 0000000..471322a --- /dev/null +++ b/Runtime/Devices/AudioBuffer.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: 33b41580dfbee4624831d45389dcc85e +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {instanceID: 0} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Devices/AudioDevice.cs b/Runtime/Devices/AudioDevice.cs new file mode 100644 index 0000000..0c0ee5f --- /dev/null +++ b/Runtime/Devices/AudioDevice.cs @@ -0,0 +1,193 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +namespace VideoKit.Devices { + + using System; + using System.Linq; + using System.Runtime.InteropServices; + using System.Text; + using System.Threading.Tasks; + using AOT; + using UnityEngine; + using Internal; + using Utilities; + using DeviceFlags = Internal.VideoKit.DeviceFlags; + + /// + /// Hardware audio input device. + /// + public sealed class AudioDevice : MediaDevice { + + #region --Properties-- + /// + /// Is echo cancellation supported? + /// + public bool echoCancellationSupported => device.Flags().HasFlag(DeviceFlags.EchoCancellation); + + /// + /// Enable or disable Adaptive Echo Cancellation (AEC). + /// + public bool echoCancellation { + get => device.EchoCancellation(); + set => device.SetEchoCancellation(value); + } + + /// + /// Audio sample rate. + /// + public int sampleRate { + get => device.SampleRate(); + set => device.SetSampleRate(value); + } + + /// + /// Audio channel count. + /// + public int channelCount { + get => device.ChannelCount(); + set => device.SetChannelCount(value); + } + #endregion + + + #region --Streaming-- + /// + /// Start running. + /// NOTE: This requires an active VideoKit plan. + /// + /// Delegate to receive audio buffers. 
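+        /// Usage sketch (hypothetical; `ProcessSamples` is a placeholder for your own handler):
+        ///   var microphones = await AudioDevice.Discover();
+        ///   microphones[0].StartRunning(buffer => ProcessSamples(buffer.sampleBuffer, buffer.sampleRate));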
+ public unsafe void StartRunning (Action handler) { + Action wrapper = sampleBuffer => handler?.Invoke(new AudioBuffer(this, sampleBuffer)); + streamHandle = GCHandle.Alloc(wrapper, GCHandleType.Normal); + lifecycleHelper = LifecycleHelper.Create(); + lifecycleHelper.onQuit += StopRunning; + try { + device.StartRunning(OnAudioBuffer, (IntPtr)streamHandle).CheckStatus(); + } catch (Exception ex) { + streamHandle.Free(); + streamHandle = default; + throw; + } + } + + /// + /// Stop running. + /// + public override void StopRunning () { + if (lifecycleHelper) + lifecycleHelper.Dispose(); + device.StopRunning().CheckStatus(); + if (streamHandle != default) + streamHandle.Free(); + streamHandle = default; + lifecycleHelper = default; + } + #endregion + + + #region --Discovery-- + /// + /// Check the current microphone permission status. + /// + /// Request permissions if the user has not yet been asked. + /// Current microphone permissions status. + public static Task CheckPermissions (bool request = true) => CheckPermissions(VideoKit.PermissionType.Microphone, request); + + /// + /// Discover available audio input devices. + /// + /// Configure the application's global audio session for audio device discovery. This is required for discovering audio devices on iOS. + public static async Task Discover (bool configureAudioSession = true) { + // Check session + await VideoKitSettings.Instance.CheckSession(); + // Configure audio session + if (configureAudioSession) + VideoKitExt.ConfigureAudioSession(); + // Discover + var devices = await DiscoverNative(); + // Return + return devices; + } + #endregion + + + #region --Operations-- + private GCHandle streamHandle; + private LifecycleHelper lifecycleHelper; + + private int priority { // #24 + get { + var order = 0; + if (!defaultForMediaType) + order += 1; + if (location == Location.External) + order += 10; + if (location == Location.Unknown) + order += 100; + return order; + } + } + + internal AudioDevice (IntPtr device) : base(device) { } + + public override string ToString () => $"microphone:{uniqueID}"; + + private static Task DiscoverNative () { + // Discover + var tcs = new TaskCompletionSource(); + var handle = GCHandle.Alloc(tcs, GCHandleType.Normal); + try { + VideoKit.DiscoverMicrophones(OnDiscoverMicrophones, (IntPtr)handle).CheckStatus(); + } catch (Exception ex) { + handle.Free(); + tcs.SetException(ex); + } + // Return + return tcs.Task; + } + + [MonoPInvokeCallback(typeof(VideoKit.DeviceDiscoveryHandler))] + private static unsafe void OnDiscoverMicrophones (IntPtr context, IntPtr devices, int count) { + // Check + if (!VideoKit.IsAppDomainLoaded) + return; + // Get tcs + TaskCompletionSource tcs; + try { + var handle = (GCHandle)context; + tcs = handle.Target as TaskCompletionSource; + handle.Free(); + } catch (Exception ex) { + Debug.LogException(ex); + return; + } + // Invoke + var microphones = Enumerable + .Range(0, count) + .Select(idx => new AudioDevice(((IntPtr*)devices)[idx])) + .OrderBy(microphone => microphone.priority) + .ToArray(); + tcs?.SetResult(microphones); + } + + [MonoPInvokeCallback(typeof(VideoKit.SampleBufferHandler))] + private static void OnAudioBuffer (IntPtr context, IntPtr sampleBuffer) { + // Check + if (!VideoKit.IsAppDomainLoaded) + return; + // Invoke + try { + var handle = (GCHandle)context; + var handler = handle.Target as Action; + handler?.Invoke(sampleBuffer); + } + catch (Exception ex) { + Debug.LogException(ex); + } + } + #endregion + } +} \ No newline at end of file diff --git 
a/Runtime/Devices/AudioDevice.cs.meta b/Runtime/Devices/AudioDevice.cs.meta new file mode 100644 index 0000000..6d0d742 --- /dev/null +++ b/Runtime/Devices/AudioDevice.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: 4ab8f543c4f3b40f3b8374b82df26d48 +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {instanceID: 0} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Devices/CameraDevice.cs b/Runtime/Devices/CameraDevice.cs new file mode 100644 index 0000000..d598163 --- /dev/null +++ b/Runtime/Devices/CameraDevice.cs @@ -0,0 +1,536 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +namespace VideoKit.Devices { + + using AOT; + using System; + using System.Collections.Generic; + using System.Linq; + using System.Runtime.InteropServices; + using System.Text; + using System.Threading; + using System.Threading.Tasks; + using UnityEngine; + using Internal; + using Utilities; + using DeviceFlags = Internal.VideoKit.DeviceFlags; + + /// + /// Hardware camera device. + /// + public sealed class CameraDevice : MediaDevice { + + #region --Types-- + /// + /// Exposure mode. + /// + public enum ExposureMode : int { + /// + /// Continuous auto exposure. + /// + Continuous = 0, + /// + /// Locked auto exposure. + /// + Locked = 1, + /// + /// Manual exposure. + /// + Manual = 2 + } + + /// + /// Photo flash mode. + /// + public enum FlashMode : int { + /// + /// Never use flash. + /// + Off = 0, + /// + /// Always use flash. + /// + On = 1, + /// + /// Let the sensor detect if it needs flash. + /// + Auto = 2 + } + + /// + /// Focus mode. + /// + public enum FocusMode : int { + /// + /// Continuous autofocus. + /// + Continuous = 0, + /// + /// Locked focus. + /// + Locked = 1, + } + + /// + /// Torch mode. + /// + public enum TorchMode : int { + /// + /// Disabled torch. + /// + Off = 0, + /// + /// Maximum supported torch level. + /// + Maximum = 100 + } + + /// + /// Video stabilization mode. + /// + public enum VideoStabilizationMode : int { + /// + /// Disabled video stabilization. + /// + Off = 0, + /// + /// Standard video stabilization. + /// + Standard = 1 + } + + /// + /// White balance mode. + /// + public enum WhiteBalanceMode : int { + /// + /// Continuous auto white balance. + /// + Continuous = 0, + /// + /// Locked auto white balance. + /// + Locked = 1, + } + #endregion + + + #region --Properties-- + /// + /// Is this camera front facing? + /// + public bool frontFacing => device.Flags().HasFlag(DeviceFlags.FrontFacing); + + /// + /// Is flash supported for photo capture? + /// + public bool flashSupported => device.Flags().HasFlag(DeviceFlags.Flash); + + /// + /// Is torch supported? + /// + public bool torchSupported => device.Flags().HasFlag(DeviceFlags.Torch); + + /// + /// Is setting the exposure point supported? + /// + public bool exposurePointSupported => device.Flags().HasFlag(DeviceFlags.ExposurePoint); + + /// + /// Is setting the focus point supported? + /// + public bool focusPointSupported => device.Flags().HasFlag(DeviceFlags.FocusPoint); + + /// + /// Is depth streaming supported? + /// + internal bool depthStreamingSupported => device.Flags().HasFlag(DeviceFlags.Depth); + + /// + /// Field of view in degrees. + /// + public (float width, float height) fieldOfView { + get { + device.FieldOfView(out var width, out var height); + return (width, height); + } + } + + /// + /// Exposure bias range in EV. 
+ /// + public (float min, float max) exposureBiasRange { + get { + device.ExposureBiasRange(out var min, out var max); + return (min, max); + } + } + + /// + /// Exposure duration range in seconds. + /// + public (float min, float max) exposureDurationRange { + get { + device.ExposureDurationRange(out var min, out var max); + return (min, max); + } + } + + /// + /// Sensor sensitivity range. + /// + public (float min, float max) ISORange { + get { + device.ISORange(out var min, out var max); + return (min, max); + } + } + + /// + /// Zoom ratio range. + /// + public (float min, float max) zoomRange { + get { + device.ZoomRange(out var min, out var max); + return (min, max); + } + } + + /// + /// Get or set the preview resolution. + /// + public (int width, int height) previewResolution { + get { device.PreviewResolution(out var width, out var height); return (width, height); } + set => device.SetPreviewResolution(value.width, value.height); + } + + /// + /// Get or set the photo resolution. + /// + public (int width, int height) photoResolution { + get { device.PhotoResolution(out var width, out var height); return (width, height); } + set => device.SetPhotoResolution(value.width, value.height); + } + + /// + /// Get or set the preview framerate. + /// + public int frameRate { + get => device.FrameRate(); + set => device.SetFrameRate(value); + } + + /// + /// Get or set the exposure mode. + /// If the requested exposure mode is not supported, the camera device will ignore. + /// + public ExposureMode exposureMode { + get => device.ExposureMode(); + set => device.SetExposureMode(value); + } + + /// + /// Get or set the exposure bias. + /// This value must be in the range returned by `exposureRange`. + /// + public float exposureBias { + get => device.ExposureBias(); + set => device.SetExposureBias(value); + } + + /// + /// Get or set the current exposure duration in seconds. + /// + public float exposureDuration => device.ExposureDuration(); + + /// + /// Get or set the current exposure sensitivity. + /// + public float ISO => device.ISO(); + + /// + /// Get or set the photo flash mode. + /// + public FlashMode flashMode { + get => device.FlashMode(); + set => device.SetFlashMode(value); + } + + /// + /// Get or set the focus mode. + /// + public FocusMode focusMode { + get => device.FocusMode(); + set => device.SetFocusMode(value); + } + + /// + /// Get or set the torch mode. + /// + public TorchMode torchMode { + get => device.TorchMode(); + set => device.SetTorchMode(value); + } + + /// + /// Get or set the white balance mode. + /// + public WhiteBalanceMode whiteBalanceMode { + get => device.WhiteBalanceMode(); + set => device.SetWhiteBalanceMode(value); + } + + /// + /// Get or set the video stabilization mode. + /// + public VideoStabilizationMode videoStabilizationMode { + get => device.VideoStabilizationMode(); + set => device.SetVideoStabilizationMode(value); + } + + /// + /// Get or set the zoom ratio. + /// This value must be in the range returned by `zoomRange`. + /// + public float zoomRatio { + get => device.ZoomRatio(); + set => device.SetZoomRatio(value); + } + #endregion + + + #region --Controls-- + /// + /// Check if a given exposure mode is supported by the camera device. + /// + /// Exposure mode. 
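+        /// Usage sketch (hypothetical): lock exposure only when the device supports it:
+        ///   if (camera.ExposureModeSupported(CameraDevice.ExposureMode.Locked))
+        ///       camera.exposureMode = CameraDevice.ExposureMode.Locked;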
+ public bool ExposureModeSupported (ExposureMode mode) => mode switch { + ExposureMode.Continuous => device.Flags().HasFlag(DeviceFlags.ExposureContinuous), + ExposureMode.Locked => device.Flags().HasFlag(DeviceFlags.ExposureLock), + ExposureMode.Manual => device.Flags().HasFlag(DeviceFlags.ExposureManual), + _ => false + }; + + /// + /// Check if a given focus mode is supported by the camera device. + /// + /// Focus mode. + public bool FocusModeSupported (FocusMode mode) => mode switch { + FocusMode.Continuous => device.Flags().HasFlag(DeviceFlags.FocusContinuous), + FocusMode.Locked => device.Flags().HasFlag(DeviceFlags.FocusLock), + _ => false, + }; + + /// + /// Check if a given white balance mode is supported by the camera device. + /// + /// White balance mode. + public bool WhiteBalanceModeSupported (WhiteBalanceMode mode) => mode switch { + WhiteBalanceMode.Continuous => device.Flags().HasFlag(DeviceFlags.WhiteBalanceContinuous), + WhiteBalanceMode.Locked => device.Flags().HasFlag(DeviceFlags.WhiteBalanceLock), + _ => false + }; + + /// + /// Check if a given video stabilization mode is supported by the camera device. + /// + /// Video stabilization mode. + public bool VideoStabilizationModeSupported (VideoStabilizationMode mode) => mode switch { + VideoStabilizationMode.Off => true, + VideoStabilizationMode.Standard => device.Flags().HasFlag(DeviceFlags.VideoStabilization), + _ => false + }; + + /// + /// Set manual exposure. + /// + /// Exposure duration in seconds. MUST be in `exposureDurationRange`. + /// Sensor sensitivity ISO value. MUST be in `ISORange`. + public void SetExposureDuration (float duration, float ISO) => device.SetExposureDuration(duration, ISO); + + /// + /// Set the exposure point of interest. + /// The point is specified in normalized coordinates in range [0.0, 1.0]. + /// + /// Normalized x coordinate. + /// Normalized y coordinate. + public void SetExposurePoint (float x, float y) => device.SetExposurePoint(x, y); + + /// + /// Set the focus point of interest. + /// The point is specified in normalized coordinates in range [0.0, 1.0]. + /// + /// Normalized x coordinate. + /// Normalized y coordinate. + public void SetFocusPoint (float x, float y) => device.SetFocusPoint(x, y); + #endregion + + + #region --Streaming-- + /// + /// Start the camera preview. + /// + /// Delegate to receive preview image frames. + public void StartRunning (Action handler) { + Action wrapper = sampleBuffer => handler?.Invoke(new CameraImage(this, sampleBuffer)); + previewHandle = GCHandle.Alloc(wrapper, GCHandleType.Normal); + lifecycleHelper = LifecycleHelper.Create(); + lifecycleHelper.onQuit += StopRunning; + try { + device.StartRunning(OnPreviewImage, (IntPtr)previewHandle).CheckStatus(); + } catch (Exception) { + previewHandle.Free(); + previewHandle = default; + throw; + } + } + + /// + /// Start the camera preview with depth streaming. + /// + /// Delegate to receive preview image and depth frames. + internal void StartRunning (Action handler) { + throw new NotImplementedException(); + } + + /// + /// Stop the camera preview. + /// + public override void StopRunning () { + device.StopRunning().CheckStatus(); + lifecycleHelper?.Dispose(); + lifecycleHelper = default; + if (previewHandle != default) + previewHandle.Free(); + previewHandle = default; + } + + /// + /// Capture a photo. + /// + /// Delegate to receive high-resolution photo. 
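+        /// Usage sketch (hypothetical; `OnPhoto` is a placeholder handler for the high-resolution image):
+        ///   camera.flashMode = CameraDevice.FlashMode.Auto;
+        ///   camera.CapturePhoto(photo => OnPhoto(photo));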
+ public void CapturePhoto (Action handler) { + Action wrapper = sampleBuffer => handler?.Invoke(new CameraImage(this, sampleBuffer)); + var handle = GCHandle.Alloc(wrapper, GCHandleType.Normal); + device.CapturePhoto(OnPhotoImage, (IntPtr)handle); + } + #endregion + + + #region --Discovery-- + /// + /// Check the current camera permission status. + /// + /// Request permissions if the user has not yet been asked. + /// Camera permission status. + public static Task CheckPermissions (bool request = true) => CheckPermissions(VideoKit.PermissionType.Camera, request); + + /// + /// Discover available camera devices. + /// + public static async Task Discover () { + // Check session + await VideoKitSettings.Instance.CheckSession(); + // Discover + var devices = await DiscoverNative(); + // Return + return devices; + } + #endregion + + + #region --Operations-- + private GCHandle previewHandle; + private LifecycleHelper lifecycleHelper; + + private int priority { // #24 + get { + var order = 0; + if (!defaultForMediaType) + order += 1; + if (location == Location.External) + order += 10; + if (location == Location.Unknown) + order += 100; + return order; + } + } + + internal CameraDevice (IntPtr device) : base(device) { } + + public override string ToString () => $"camera:{uniqueID}"; + + private static Task DiscoverNative () { + // Discover + var tcs = new TaskCompletionSource(); + var handle = GCHandle.Alloc(tcs, GCHandleType.Normal); + try { + VideoKit.DiscoverCameras(OnDiscoverCameras, (IntPtr)handle).CheckStatus(); + } catch (Exception ex) { + handle.Free(); + tcs.SetException(ex); + } + // Return + return tcs.Task; + } + + [MonoPInvokeCallback(typeof(VideoKit.DeviceDiscoveryHandler))] + private static unsafe void OnDiscoverCameras (IntPtr context, IntPtr devices, int count) { + // Check + if (!VideoKit.IsAppDomainLoaded) + return; + // Get tcs + TaskCompletionSource tcs; + try { + var handle = (GCHandle)context; + tcs = handle.Target as TaskCompletionSource; + handle.Free(); + } catch (Exception ex) { + Debug.LogException(ex); + return; + } + // Invoke + var cameras = Enumerable + .Range(0, count) + .Select(idx => new CameraDevice(((IntPtr*)devices)[idx])) + .OrderBy(camera => camera.priority) + .ToArray(); + tcs?.SetResult(cameras); + } + + [MonoPInvokeCallback(typeof(VideoKit.SampleBufferHandler))] + private static unsafe void OnPreviewImage (IntPtr context, IntPtr sampleBuffer) { + // Check + if (!VideoKit.IsAppDomainLoaded) + return; + try { + // Invoke + var handle = (GCHandle)context; + var handler = handle.Target as Action; + handler?.Invoke(sampleBuffer); + } catch (Exception ex) { + Debug.LogException(ex); + } + } + + [MonoPInvokeCallback(typeof(VideoKit.SampleBufferHandler))] + private static unsafe void OnPhotoImage (IntPtr context, IntPtr sampleBuffer) { + // Check + if (!VideoKit.IsAppDomainLoaded) + return; + try { + // Invoke + var handle = (GCHandle)context; + var handler = handle.Target as Action; + handle.Free(); + handler?.Invoke(sampleBuffer); + } catch (Exception ex) { + Debug.LogException(ex); + } + } + #endregion + } +} \ No newline at end of file diff --git a/Runtime/Devices/CameraDevice.cs.meta b/Runtime/Devices/CameraDevice.cs.meta new file mode 100644 index 0000000..fc352de --- /dev/null +++ b/Runtime/Devices/CameraDevice.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: 97658c4daed614704beb1a8fb7e3412a +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {instanceID: 0} + userData: + assetBundleName: + 
assetBundleVariant: diff --git a/Runtime/Devices/CameraImage.cs b/Runtime/Devices/CameraImage.cs new file mode 100644 index 0000000..7e78b9a --- /dev/null +++ b/Runtime/Devices/CameraImage.cs @@ -0,0 +1,312 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +namespace VideoKit.Devices { + + using System; + using Unity.Collections; + using Unity.Collections.LowLevel.Unsafe; + using Internal; + using Metadata = Internal.VideoKit.Metadata; + + /// + /// Camera image provided by a camera device. + /// The camera image always contains a pixel buffer along with image metadata. + /// The format of the pixel buffer varies by platform and must be taken into consideration when using the pixel data. + /// + public readonly partial struct CameraImage { + + #region --Enumerations-- + /// + /// Image buffer format. + /// + public enum Format { // CHECK // VideoKit.h + /// + /// Unknown image format. + /// + Unknown = 0, + /// + /// Generic YUV 420 planar format. + /// + YCbCr420 = 1, + /// + /// RGBA8888. + /// + RGBA8888 = 2, + /// + /// BGRA8888. + /// + BGRA8888 = 3, + } + #endregion + + + #region --Types-- + /// + /// Image plane for planar formats. + /// + public readonly struct Plane { + + /// + /// Pixel buffer. + /// + public readonly NativeArray buffer; + + /// + /// Plane width. + /// + public readonly int width; + + /// + /// Plane height. + /// + public readonly int height; + + /// + /// Row stride in bytes. + /// + public readonly int rowStride; + + /// + /// Pixel stride in bytes. + /// + public readonly int pixelStride; + + /// + /// Create a camera image plane. + /// + public Plane (NativeArray buffer, int width, int height, int rowStride, int pixelStride) { + this.buffer = buffer; + this.width = width; + this.height = height; + this.rowStride = rowStride; + this.pixelStride = pixelStride; + } + } + #endregion + + + #region --Properties-- + /// + /// Camera device that this image was generated from. + /// + public readonly CameraDevice device; + + /// + /// Image pixel buffer. + /// Some planar images might not have a contiguous pixel buffer. + /// In this case, the buffer is uninitialized. + /// + public readonly NativeArray pixelBuffer; + + /// + /// Image format. + /// + public readonly Format format; + + /// + /// Image width. + /// + public readonly int width; + + /// + /// Image height. + /// + public readonly int height; + + /// + /// Image row stride in bytes. + /// This is zero if the image is planar. + /// + public readonly int rowStride; + + /// + /// Image timestamp in nanoseconds. + /// The timestamp is based on the system media clock. + /// + public readonly long timestamp; + + /// + /// Whether the image is vertically mirrored. + /// + public readonly bool verticallyMirrored; + + /// + /// Image plane for planar formats. + /// This is `null` for interleaved formats. + /// + public readonly Plane[] planes; + + /// + /// Camera intrinsics as a flattened row-major 3x3 matrix. + /// This is `null` if the camera does not report it. + /// + public readonly float[] intrinsics; + + /// + /// Exposure bias value in EV. + /// This is `null` if the camera does not report it. + /// + public readonly float? exposureBias; + + /// + /// Image exposure duration in seconds. + /// This is `null` if the camera does not report it. + /// + public readonly float? exposureDuration; + + /// + /// Sensor sensitivity ISO value. + /// This is `null` if the camera does not report it. + /// + public readonly float? ISO; + + /// + /// Camera focal length in millimeters. 
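// Usage sketch (illustrative only): inspecting the pixel data of a CameraImage inside a preview
// handler. `pixelBuffer` is assumed to be a NativeArray<byte>; its element type is elided in the
// listing above. The class and method names below are hypothetical placeholders.
namespace VideoKit.Devices {
    using UnityEngine;

    static class CameraImageExample {
        public static void OnPreviewImage (CameraImage image) {
            if (image.format == CameraImage.Format.RGBA8888 || image.format == CameraImage.Format.BGRA8888) {
                // Interleaved formats expose one contiguous buffer with `rowStride` bytes per row
                var pixels = image.pixelBuffer;
                Debug.Log($"Interleaved {image.width}x{image.height}, rowStride {image.rowStride}, {pixels.Length} bytes");
            } else if (image.planes != null) {
                // Planar formats (e.g. YCbCr420) expose per-plane buffers with their own strides
                foreach (var plane in image.planes)
                    Debug.Log($"Plane {plane.width}x{plane.height}, rowStride {plane.rowStride}, pixelStride {plane.pixelStride}");
            }
        }
    }
}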
+ /// This is `null` if the camera does not report it. + /// + public readonly float? focalLength; + + /// + /// Image aperture, in f-number. + /// This is `null` if the camera does not report it. + /// + public readonly float? fNumber; + + /// + /// Ambient brightness. + /// This is `null` if the camera does not report it. + /// + public readonly float? brightness; + + /// + /// Safely clone the camera image. + /// The clone will never contain a valid pixel buffer or plane data. + /// + public CameraImage Clone () => new CameraImage( + device, + default, + format, + width, + height, + 0, + timestamp, + verticallyMirrored, + null, + intrinsics, + exposureBias, + exposureDuration, + ISO, + focalLength, + fNumber, + brightness + ); + #endregion + + + #region --Operations-- + internal readonly IntPtr nativeImage; + + /// + /// Create a camera image. + /// + internal unsafe CameraImage ( + CameraDevice device, + IntPtr image + ) { + this.device = device; + this.nativeImage = image; + this.format = image.CameraImageFormat(); + this.width = image.CameraImageWidth(); + this.height = image.CameraImageHeight(); + this.rowStride = image.CameraImageRowStride(); + this.timestamp = image.CameraImageTimestamp(); + this.verticallyMirrored = image.CameraImageVerticallyMirrored(); + // Pixel buffer + this.pixelBuffer = NativeArrayUnsafeUtility.ConvertExistingDataToNativeArray( + image.CameraImageData(), + image.CameraImageDataSize(), + Allocator.None + ); + #if ENABLE_UNITY_COLLECTIONS_CHECKS + NativeArrayUnsafeUtility.SetAtomicSafetyHandle(ref pixelBuffer, AtomicSafetyHandle.Create()); + #endif + // Planes + var planeCount = image.CameraImagePlaneCount(); + this.planes = planeCount > 0 ? new CameraImage.Plane[planeCount] : null; + for (var i = 0; i < planes?.Length; ++i) { + var planeBuffer = NativeArrayUnsafeUtility.ConvertExistingDataToNativeArray( + image.CameraImagePlaneData(i), + image.CameraImagePlaneDataSize(i), + Allocator.None + ); + #if ENABLE_UNITY_COLLECTIONS_CHECKS + NativeArrayUnsafeUtility.SetAtomicSafetyHandle(ref planeBuffer, AtomicSafetyHandle.Create()); + #endif + planes[i] = new CameraImage.Plane( + planeBuffer, + image.CameraImagePlaneWidth(i), + image.CameraImagePlaneHeight(i), + image.CameraImagePlaneRowStride(i), + image.CameraImagePlanePixelStride(i) + ); + } + // Intrinsics + var intrinsics = new float[9]; + fixed (float* dst = intrinsics) + this.intrinsics = image.CameraImageMetadata(Metadata.IntrinsicMatrix, dst, intrinsics.Length) ? intrinsics : null; + // Metadata + var metadata = 0f; + this.exposureBias = image.CameraImageMetadata(Metadata.ExposureBias, &metadata) ? (float?)metadata : null; + this.exposureDuration = image.CameraImageMetadata(Metadata.ExposureDuration, &metadata) ? (float?)metadata : null; + this.ISO = image.CameraImageMetadata(Metadata.ISO, &metadata) ? (float?)metadata : null; + this.focalLength = image.CameraImageMetadata(Metadata.FocalLength, &metadata) ? (float?)metadata : null; + this.fNumber = image.CameraImageMetadata(Metadata.FNumber, &metadata) ? (float?)metadata : null; + this.brightness = image.CameraImageMetadata(Metadata.Brightness, &metadata) ? (float?)metadata : null; + } + + /// + /// Create a camera image. + /// + private CameraImage ( + CameraDevice device, + NativeArray pixelBuffer, + Format format, + int width, + int height, + int rowStride, + long timestamp, + bool mirrored, + Plane[] planes = null, + float[] intrinsics = null, + float? exposureBias = null, + float? exposureDuration = null, + float? ISO = null, + float? 
focalLength = null, + float? fNumber = null, + float? brightness = null, + IntPtr nativeImage = default + ) { + this.device = device; + this.pixelBuffer = pixelBuffer; + this.format = format; + this.width = width; + this.height = height; + this.rowStride = rowStride; + this.timestamp = timestamp; + this.verticallyMirrored = mirrored; + this.planes = planes; + this.intrinsics = intrinsics; + this.exposureBias = exposureBias; + this.exposureDuration = exposureDuration; + this.ISO = ISO; + this.focalLength = focalLength; + this.fNumber = fNumber; + this.brightness = brightness; + this.nativeImage = nativeImage; + } + #endregion + } +} \ No newline at end of file diff --git a/Runtime/Devices/CameraImage.cs.meta b/Runtime/Devices/CameraImage.cs.meta new file mode 100644 index 0000000..ddaa4f8 --- /dev/null +++ b/Runtime/Devices/CameraImage.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: 6bbe03ba814ad4e2bb60b3ec96d1064c +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {instanceID: 0} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Devices/MediaDevice.cs b/Runtime/Devices/MediaDevice.cs new file mode 100644 index 0000000..b136737 --- /dev/null +++ b/Runtime/Devices/MediaDevice.cs @@ -0,0 +1,218 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +namespace VideoKit.Devices { + + using AOT; + using System; + using System.Collections; + using System.Runtime.InteropServices; + using System.Text; + using System.Threading.Tasks; + using UnityEngine; + using UnityEngine.Android; + using Internal; + using Utilities; + using DeviceFlags = Internal.VideoKit.DeviceFlags; + using PermissionType = Internal.VideoKit.PermissionType; + + /// + /// Media device which provides media sample buffers. + /// + public abstract class MediaDevice { + + #region --Enumerations-- + /// + /// Device location. + /// + public enum Location : int { // CHECK // Must match `VideoKit.h` + /// + /// Device type is unknown. + /// + Unknown = 0, + /// + /// Device is internal. + /// + Internal = 1 << 0, + /// + /// Device is external. + /// + External = 1 << 1, + } + + /// + /// Device permissions status. + /// + public enum PermissionStatus : int { + /// + /// User has not authorized or denied access to media device. + /// + Unknown = 0, + /// + /// User has denied access to media device. + /// + Denied = 2, + /// + /// User has authorized access to media device. + /// + Authorized = 3 + } + #endregion + + + #region --Properties-- + /// + /// Device unique ID. + /// + public string uniqueID { get; protected set; } + + /// + /// Display friendly device name. + /// + public string name { get; protected set; } + + /// + /// Device location. + /// + public virtual Location location => (Location)((int)device.Flags() & 0x3); + + /// + /// Device is the default device for its media type. + /// + public virtual bool defaultForMediaType => device.Flags().HasFlag(DeviceFlags.Default); + #endregion + + + #region --Events-- + /// + /// Event raised when the device is disconnected. + /// + public event Action onDisconnected; + #endregion + + + #region --Streaming-- + /// + /// Whether the device is running. + /// + public virtual bool running => device.Running(); + + /// + /// Stop running. 
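// Usage sketch (illustrative only): inspecting a device and reacting to disconnection. `device`
// can be any MediaDevice subclass instance (e.g. a CameraDevice returned by discovery); the class
// and method names below are hypothetical placeholders.
namespace VideoKit.Devices {
    using UnityEngine;

    static class MediaDeviceExample {
        public static void Watch (MediaDevice device) {
            Debug.Log($"Using {device.name} ({device.uniqueID}), location: {device.location}");
            device.onDisconnected += () => Debug.LogWarning($"{device.name} was disconnected");
            // ...later, when streaming is no longer needed...
            if (device.running)
                device.StopRunning();
        }
    }
}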
+ /// + public abstract void StopRunning (); + #endregion + + + #region --Operations-- + protected readonly IntPtr device; + private readonly GCHandle weakSelf; + + internal MediaDevice (IntPtr device) { + this.device = device; + this.weakSelf = GCHandle.Alloc(this, GCHandleType.Weak); + // Cache UID + var uidBuilder = new StringBuilder(2048); + device.UniqueID(uidBuilder); + this.uniqueID = uidBuilder.ToString(); + // Cache name + var nameBuilder = new StringBuilder(2048); + device.Name(nameBuilder); + this.name = nameBuilder.ToString(); + // Register handlers + device.SetDisconnectHandler(OnDeviceDisconnect, (IntPtr)weakSelf); + } + + ~MediaDevice () { + device.ReleaseDevice(); + weakSelf.Free(); + } + + public static implicit operator IntPtr (MediaDevice device) => device.device; + + protected static Task CheckPermissions (PermissionType type, bool request) => Application.platform switch { + RuntimePlatform.Android => CheckPermissionsAndroid(type, request), + _ => CheckPermissionsNative(type, request), + }; + + private static Task CheckPermissionsNative (PermissionType type, bool request) { + var tcs = new TaskCompletionSource(); + var handle = GCHandle.Alloc(tcs, GCHandleType.Normal); + try { + VideoKit.CheckPermissions(type, request, OnPermissionResult, (IntPtr)handle).CheckStatus(); + } catch (Exception ex) { + handle.Free(); + tcs.SetException(ex); + } + return tcs.Task; + } + + private static Task CheckPermissionsAndroid (PermissionType type, bool request) { + var permission = PermissionTypeToString(type); + // Check status + var status = GetPermissionStatus(permission); + if (!request || status == PermissionStatus.Authorized) + return Task.FromResult(status); + // Request + var tcs = new TaskCompletionSource(); + var callbacks = new PermissionCallbacks(); + callbacks.PermissionGranted += _ => tcs.SetResult(PermissionStatus.Authorized); + callbacks.PermissionDenied += _ => tcs.SetResult(PermissionStatus.Denied); + callbacks.PermissionDeniedAndDontAskAgain += _ => tcs.SetResult(PermissionStatus.Denied); + Permission.RequestUserPermission(permission, callbacks); + // Return + return tcs.Task; + // Utilities + static PermissionStatus GetPermissionStatus (string type) => Permission.HasUserAuthorizedPermission(type) switch { + true => MediaDevice.PermissionStatus.Authorized, + false => MediaDevice.PermissionStatus.Unknown + }; + static string PermissionTypeToString (PermissionType type) => type switch { + PermissionType.Microphone => Permission.Microphone, + PermissionType.Camera => Permission.Camera, + _ => null + }; + } + #endregion + + + #region --Callbacks-- + + [MonoPInvokeCallback(typeof(VideoKit.DeviceDisconnectHandler))] + private static void OnDeviceDisconnect (IntPtr context) { + // Check + if (!VideoKit.IsAppDomainLoaded) + return; + try { + // Invoke + var handle = (GCHandle)context; + var device = handle.Target as MediaDevice; + device?.onDisconnected?.Invoke(); + } catch (Exception ex) { + Debug.LogException(ex); + } + } + + [MonoPInvokeCallback(typeof(VideoKit.PermissionResultHandler))] + private static void OnPermissionResult (IntPtr context, PermissionStatus status) { + // Check + if (!VideoKit.IsAppDomainLoaded) + return; + // Get tcs + TaskCompletionSource tcs; + try { + var handle = (GCHandle)context; + tcs = handle.Target as TaskCompletionSource; + handle.Free(); + } catch (Exception ex) { + Debug.LogException(ex); + return; + } + // Invoke + tcs?.SetResult(status); + } + #endregion + } +} \ No newline at end of file diff --git a/Runtime/Devices/MediaDevice.cs.meta 
b/Runtime/Devices/MediaDevice.cs.meta new file mode 100644 index 0000000..70b9f55 --- /dev/null +++ b/Runtime/Devices/MediaDevice.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: 94cd9256724b8493ba03318452541751 +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {instanceID: 0} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Inputs.meta b/Runtime/Inputs.meta new file mode 100644 index 0000000..b23fa28 --- /dev/null +++ b/Runtime/Inputs.meta @@ -0,0 +1,8 @@ +fileFormatVersion: 2 +guid: 8936150c677824625aafd615f6355b6b +folderAsset: yes +DefaultImporter: + externalObjects: {} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Inputs/AsyncTextureInput.cs b/Runtime/Inputs/AsyncTextureInput.cs new file mode 100644 index 0000000..3ff1309 --- /dev/null +++ b/Runtime/Inputs/AsyncTextureInput.cs @@ -0,0 +1,62 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +namespace VideoKit.Recorders.Inputs { + + using UnityEngine; + using UnityEngine.Rendering; + using Unity.Collections.LowLevel.Unsafe; + + /// + /// Recorder input for recording video frames from textures. + /// Textures will be recorded by performing an asynchronous pixel buffer readback. + /// This texture input typically has better performance, but is not supported on all platforms. + /// Check for support using `SystemInfo.supportsAsyncGPURReadback`. + /// + public sealed class AsyncTextureInput : TextureInput { + + #region --Client API-- + /// + /// Create a texture input which performs asynchronous readbacks. + /// + /// Media recorder to receive video frames. + public AsyncTextureInput (MediaRecorder recorder) : base(recorder) => commit = true; + + /// + /// Commit a video frame from a texture. + /// + /// Source texture. + /// Frame timestamp in nanoseconds. + public override unsafe void CommitFrame (Texture texture, long timestamp) { + // Blit + var (width, height) = recorder.frameSize; + var renderTexture = RenderTexture.GetTemporary(width, height, 24, RenderTextureFormat.ARGB32); + Graphics.Blit(texture, renderTexture); + // Readback + // https://issuetracker.unity3d.com/issues/asyncgpureadback-dot-requestintonativearray-crashes-unity-when-trying-to-request-a-copy-to-the-same-nativearray-multiple-times + // https://issuetracker.unity3d.com/issues/asyncgpureadback-dot-requestintonativearray-causes-invalidoperationexception-on-nativearray + AsyncGPUReadback.Request(renderTexture, 0, request => { + if (commit) + recorder.CommitFrame(request.GetData(), timestamp); + }); + // Release + RenderTexture.ReleaseTemporary(renderTexture); + } + + /// + /// Stop recorder input and release resources. 
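// Usage sketch (illustrative only; `recorder` is an existing MediaRecorder): prefer the
// asynchronous readback path only where the platform supports it. Note that the Unity property is
// `SystemInfo.supportsAsyncGPUReadback`.
namespace VideoKit.Recorders.Inputs {
    using UnityEngine;

    static class AsyncTextureInputExample {
        public static TextureInput Create (MediaRecorder recorder) =>
            SystemInfo.supportsAsyncGPUReadback
                ? new AsyncTextureInput(recorder)
                : new TextureInput(recorder);   // fall back to synchronous readback
    }
}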
+ /// + public override void Dispose () { + commit = false; + base.Dispose(); + } + #endregion + + + #region --Operations-- + private bool commit; + #endregion + } +} \ No newline at end of file diff --git a/Runtime/Inputs/AsyncTextureInput.cs.meta b/Runtime/Inputs/AsyncTextureInput.cs.meta new file mode 100644 index 0000000..28d093b --- /dev/null +++ b/Runtime/Inputs/AsyncTextureInput.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: 0b498f173f5c74aaa9a0c3a411b48b08 +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {fileID: 2800000, guid: d6df589e0376343f7a3b406329630a74, type: 3} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Inputs/AudioDeviceInput.cs b/Runtime/Inputs/AudioDeviceInput.cs new file mode 100644 index 0000000..1b2996c --- /dev/null +++ b/Runtime/Inputs/AudioDeviceInput.cs @@ -0,0 +1,56 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +namespace VideoKit.Recorders.Inputs { + + using System; + using Devices; + using Clocks; + + /// + /// Recorder input for recoridng audio sample buffers from an audio device. + /// + public sealed class AudioDeviceInput : IDisposable { + + #region --Client API-- + /// + /// Create an audio device input. + /// + /// Media recorder to receive audio frames. + /// Clock for generating timestamps. + /// Audio manager with running device. + public AudioDeviceInput (MediaRecorder recorder, IClock clock, VideoKitAudioManager audioManager) { + this.recorder = recorder; + this.clock = clock; + this.audioManager = audioManager; + audioManager.OnAudioBuffer += OnAudioBuffer; + } + + /// + /// Create an audio device input. + /// + /// Media recorder to receive audio frames. + /// Audio manager with running device. + public AudioDeviceInput (MediaRecorder recorder, VideoKitAudioManager audioManager) : this(recorder, default, audioManager) { } + + /// + /// Stop the recorder input and release resources. + /// + public void Dispose () => audioManager.OnAudioBuffer -= OnAudioBuffer; + #endregion + + + #region --Operations-- + private readonly MediaRecorder recorder; + private readonly IClock clock; + private readonly VideoKitAudioManager audioManager; + + private void OnAudioBuffer (AudioBuffer audioBuffer) => recorder.CommitSamples( + audioBuffer.sampleBuffer, + clock?.timestamp ?? 0L + ); + #endregion + } +} \ No newline at end of file diff --git a/Runtime/Inputs/AudioDeviceInput.cs.meta b/Runtime/Inputs/AudioDeviceInput.cs.meta new file mode 100644 index 0000000..97cbd6a --- /dev/null +++ b/Runtime/Inputs/AudioDeviceInput.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: 3cc17a16f9681418c9ee13c5be0d909a +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {fileID: 2800000, guid: d6df589e0376343f7a3b406329630a74, type: 3} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Inputs/AudioInput.cs b/Runtime/Inputs/AudioInput.cs new file mode 100644 index 0000000..11f69cc --- /dev/null +++ b/Runtime/Inputs/AudioInput.cs @@ -0,0 +1,75 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +namespace VideoKit.Recorders.Inputs { + + using System; + using UnityEngine; + using Clocks; + + /// + /// Recorder input for recording audio frames from an `AudioListener` or `AudioSource`. + /// + public sealed class AudioInput : IDisposable { + + #region --Client API-- + /// + /// Create an audio recording input from a scene's AudioListener. 
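// Usage sketch (illustrative only; `recorder` is an existing MediaRecorder and `audioManager` a
// VideoKitAudioManager that is already streaming audio).
namespace VideoKit.Recorders.Inputs {
    using Devices;

    static class AudioDeviceInputExample {
        public static AudioDeviceInput RecordMicrophone (MediaRecorder recorder, VideoKitAudioManager audioManager) =>
            // Audio buffers are committed to the recorder until the input is disposed
            new AudioDeviceInput(recorder, audioManager);
    }
}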
+ /// + /// Media recorder to receive audio frames. + /// Clock for generating timestamps. Can be `null` if recorder does not require timestamps. + /// Audio listener for the current scene. + public AudioInput (MediaRecorder recorder, IClock clock, AudioListener listener) : this(recorder, clock, listener.gameObject) {} + + /// + /// Create an audio recording input from a scene's AudioListener. + /// + /// Media recorder to receive audio frames. + /// Audio listener for the current scene. + public AudioInput (MediaRecorder recorder, AudioListener listener) : this(recorder, default, listener) {} + + /// + /// Create an audio recording input from an AudioSource. + /// + /// Media recorder to receive audio frames. + /// Clock for generating timestamps. Can be `null` if recorder does not require timestamps. + /// Audio source to record. + public AudioInput (MediaRecorder recorder, IClock clock, AudioSource source) : this(recorder, clock, source.gameObject) {} + + /// + /// Create an audio recording input from an AudioSource. + /// + /// Media recorder to receive audio frames. + /// Audio source to record. + public AudioInput (MediaRecorder recorder, AudioSource source) : this(recorder, default, source) {} + + /// + /// Stop recorder input and release resources. + /// + public void Dispose () => AudioInputAttachment.DestroyImmediate(attachment); + #endregion + + + #region --Operations-- + private readonly MediaRecorder recorder; + private readonly IClock clock; + private readonly AudioInputAttachment attachment; + + private AudioInput (MediaRecorder recorder, IClock clock, GameObject gameObject) { + this.recorder = recorder; + this.clock = clock; + this.attachment = gameObject.AddComponent(); + this.attachment.sampleBufferDelegate = OnSampleBuffer; + } + + private void OnSampleBuffer (float[] data) => recorder.CommitSamples(data, clock?.timestamp ?? 0L); + + private class AudioInputAttachment : MonoBehaviour { + public Action sampleBufferDelegate; + private void OnAudioFilterRead (float[] data, int channels) => sampleBufferDelegate?.Invoke(data); + } + #endregion + } +} \ No newline at end of file diff --git a/Runtime/Inputs/AudioInput.cs.meta b/Runtime/Inputs/AudioInput.cs.meta new file mode 100644 index 0000000..400de2e --- /dev/null +++ b/Runtime/Inputs/AudioInput.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: 40aecc724a73f41a2bafc319f75e6a8a +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {fileID: 2800000, guid: d6df589e0376343f7a3b406329630a74, type: 3} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Inputs/AudioMixerInput.cs b/Runtime/Inputs/AudioMixerInput.cs new file mode 100644 index 0000000..f4c35da --- /dev/null +++ b/Runtime/Inputs/AudioMixerInput.cs @@ -0,0 +1,140 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +namespace VideoKit.Recorders.Inputs { + + using System; + using System.Runtime.CompilerServices; + using UnityEngine; + using Clocks; + using Devices; + using Utilities; + + /// + /// + internal sealed class AudioMixerInput : IDisposable { // INCOMPLETE // Sync + + #region --Client API-- + /// + /// Gain multiplier to apply to audio from audio device . + /// + public float audioDeviceGain = 1f; + + /// + /// Create an audio mixer input. + /// + /// Media recorder to receive audio frames. + /// Clock for generating timestamps. Can be `null` if recorder does not require timestamps. + /// Audio device to record microphone audio from. 
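// Usage sketch (illustrative only; `recorder` is an existing MediaRecorder and the scene is
// assumed to contain a single AudioListener).
namespace VideoKit.Recorders.Inputs {
    using UnityEngine;

    static class AudioInputExample {
        public static AudioInput RecordSceneAudio (MediaRecorder recorder) {
            var listener = Object.FindObjectOfType<AudioListener>();
            return new AudioInput(recorder, listener);  // dispose the input to stop committing audio
        }
    }
}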
+ /// Audio listener to record game audio from. + public AudioMixerInput (MediaRecorder recorder, IClock clock, VideoKitAudioManager audioManager, AudioListener audioListener) { + this.recorder = recorder; + this.clock = clock; + this.audioManager = audioManager; + this.attachment = audioListener.gameObject.AddComponent(); + this.deviceRingBuffer = new RingBuffer(RingBufferSize); + this.unityRingBuffer = new RingBuffer(RingBufferSize); + this.deviceSampleBuffer = new float[MixBufferSize]; + this.unitySampleBuffer = new float[MixBufferSize]; + this.mixedBuffer = new float[MixBufferSize]; + this.signal = new SharedSignal(2); + this.deviceFence = new object(); + this.unityFence = new object(); + signal.OnSignal += ClearBuffers; + audioManager.OnAudioBuffer += OnDeviceSampleBuffer; + attachment.OnSampleBuffer += OnUnitySampleBuffer; + } + + /// + /// Create an audio mixer input. + /// + /// Media recorder to receive audio frames. + /// Audio device to record microphone audio from. + /// Audio listener to record game audio from. + public AudioMixerInput (MediaRecorder recorder, VideoKitAudioManager audioManager, AudioListener audioListener) : this(recorder, default, audioManager, audioListener) { } + + /// + /// Stop the recorder input and release resources. + /// + public void Dispose () { + AudioMixerInputAttachment.DestroyImmediate(attachment); + audioManager.OnAudioBuffer -= OnDeviceSampleBuffer; + } + #endregion + + + #region --Operations-- + private readonly MediaRecorder recorder; + private readonly IClock clock; + private readonly VideoKitAudioManager audioManager; + private readonly AudioMixerInputAttachment attachment; + private readonly RingBuffer deviceRingBuffer; + private readonly RingBuffer unityRingBuffer; + private readonly float[] deviceSampleBuffer; + private readonly float[] unitySampleBuffer; + private readonly float[] mixedBuffer; + private readonly SharedSignal signal; + private readonly object deviceFence; + private readonly object unityFence; + private const int RingBufferSize = 16384; + private const int MixBufferSize = 1024; + + private void ClearBuffers () { + lock (deviceFence) + lock (unityFence) { + deviceRingBuffer.Clear(); + unityRingBuffer.Clear(); + } + } + + private void MixBuffers () { + while (true) { + // Read + lock (deviceFence) + lock (unityFence) { + if (deviceRingBuffer.Length < MixBufferSize || unityRingBuffer.Length < MixBufferSize) + return; + deviceRingBuffer.Read(deviceSampleBuffer); + unityRingBuffer.Read(unitySampleBuffer); + } + // Mix + Mix(deviceSampleBuffer, unitySampleBuffer, mixedBuffer); + // Commit + recorder.CommitSamples(mixedBuffer, clock?.timestamp ?? 
0L); + } + } + + private void OnDeviceSampleBuffer (AudioBuffer audioBuffer) { + signal.Signal(deviceFence); + if (!signal.signaled) + return; + lock (deviceFence) + deviceRingBuffer.Write(audioBuffer.sampleBuffer); + MixBuffers(); + } + + private void OnUnitySampleBuffer (float[] sampleBuffer) { + signal.Signal(unityFence); + if (!signal.signaled) + return; + lock (unityFence) + unityRingBuffer.Write(sampleBuffer); + MixBuffers(); + } + + private void Mix (float[] srcA, float[] srcB, float[] dst) { + for (var i = 0; i < dst.Length; ++i) + dst[i] = FTH(audioDeviceGain * srcA[i] + srcB[i]); + [MethodImpl(MethodImplOptions.AggressiveInlining)] + static float FTH (float x) => 1f - (2f * (1f / (1f + Mathf.Exp(2 * x)))); + } + + private class AudioMixerInputAttachment : MonoBehaviour { + public event Action OnSampleBuffer; + private void OnAudioFilterRead (float[] data, int channels) => OnSampleBuffer?.Invoke(data); + } + #endregion + } +} \ No newline at end of file diff --git a/Runtime/Inputs/AudioMixerInput.cs.meta b/Runtime/Inputs/AudioMixerInput.cs.meta new file mode 100644 index 0000000..0291d6c --- /dev/null +++ b/Runtime/Inputs/AudioMixerInput.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: c9eda878e43e2417c946b94541e3e1a5 +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {fileID: 2800000, guid: d6df589e0376343f7a3b406329630a74, type: 3} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Inputs/CameraDeviceInput.cs b/Runtime/Inputs/CameraDeviceInput.cs new file mode 100644 index 0000000..95642e3 --- /dev/null +++ b/Runtime/Inputs/CameraDeviceInput.cs @@ -0,0 +1,85 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +namespace VideoKit.Recorders.Inputs { + + using System; + using Clocks; + using Devices; + + /// + /// Recorder input for recording video frames from a camera device. + /// + public sealed class CameraDeviceInput : IDisposable { + + #region --Client API-- + /// + /// Number of successive camera frames to skip while recording. + /// This is very useful for GIF recording, which typically has a lower framerate appearance. + /// + public int frameSkip; + + /// + /// Create a camera device input. + /// + /// Media recorder to receive camera frames. + /// Clock for generating timestamps. + /// Camera manager with running camera. + public CameraDeviceInput (MediaRecorder recorder, IClock clock, VideoKitCameraManager cameraManager) : this(TextureInput.CreateDefault(recorder), clock, cameraManager) { } + + /// + /// Create a camera device input. + /// + /// Media recorder to receive camera frames + /// Camera manager with running camera. + public CameraDeviceInput (MediaRecorder recorder, VideoKitCameraManager cameraManager) : this(recorder, default, cameraManager) { } + + /// + /// Create a camera device input. + /// + /// Texture input to receive camera frames. + /// Clock for generating timestamps. + /// Camera manager with running camera. + public CameraDeviceInput (TextureInput input, IClock clock, VideoKitCameraManager cameraManager) { + this.input = input; + this.clock = clock; + this.cameraManager = cameraManager; + this.frameIdx = 0; + cameraManager.OnCameraFrame.AddListener(OnCameraFrame); + } + + /// + /// Create a camera device input. + /// + /// Texture input to receive camera frames. + /// Camera manager with running camera. 
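// Usage sketch (illustrative only; `recorder` is an existing MediaRecorder and `cameraManager` a
// VideoKitCameraManager whose camera is already running).
namespace VideoKit.Recorders.Inputs {
    using Devices;

    static class CameraDeviceInputExample {
        public static CameraDeviceInput RecordCameraPreview (MediaRecorder recorder, VideoKitCameraManager cameraManager) {
            var input = new CameraDeviceInput(recorder, cameraManager);
            input.frameSkip = 1;    // drop every other frame, e.g. for GIF-style output
            return input;           // dispose the input to stop committing frames
        }
    }
}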
+ public CameraDeviceInput (TextureInput input, VideoKitCameraManager cameraManager) : this(input, default, cameraManager) { } + + /// + /// Stop the recorder input and release resources. + /// + public void Dispose () { + cameraManager.OnCameraFrame.RemoveListener(OnCameraFrame); + input.Dispose(); + } + #endregion + + + #region --Operations-- + private readonly TextureInput input; + private readonly IClock clock; + private readonly VideoKitCameraManager cameraManager; + private int frameIdx; + + private void OnCameraFrame () { + // Check index + if (frameIdx++ % (frameSkip + 1) != 0) + return; + // Commit + input.CommitFrame(cameraManager.texture, clock?.timestamp ?? 0L); + } + #endregion + } +} \ No newline at end of file diff --git a/Runtime/Inputs/CameraDeviceInput.cs.meta b/Runtime/Inputs/CameraDeviceInput.cs.meta new file mode 100644 index 0000000..753da52 --- /dev/null +++ b/Runtime/Inputs/CameraDeviceInput.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: c9992f76f90c7498396207a5217899d6 +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {fileID: 2800000, guid: d6df589e0376343f7a3b406329630a74, type: 3} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Inputs/CameraInput.cs b/Runtime/Inputs/CameraInput.cs new file mode 100644 index 0000000..5e2f881 --- /dev/null +++ b/Runtime/Inputs/CameraInput.cs @@ -0,0 +1,135 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +namespace VideoKit.Recorders.Inputs { + + using System; + using System.Collections; + using System.Collections.Generic; + using UnityEngine; + using UnityEngine.Rendering; + using Clocks; + + /// + /// Recorder input for recording video frames from one or more game cameras. + /// + public class CameraInput : IDisposable { + + #region --Client API-- + /// + /// Cameras being recorded from. + /// + public readonly IReadOnlyList cameras; + + /// + /// Control number of successive camera frames to skip while recording. + /// This is very useful for GIF recording, which typically has a lower framerate appearance. + /// + public int frameSkip; + + /// + /// Create a video recording input from one or more game cameras. + /// + /// Media recorder to receive video frames. + /// Recording clock for generating timestamps. + /// Game cameras to record. + public CameraInput (MediaRecorder recorder, IClock clock, params Camera[] cameras) : this(TextureInput.CreateDefault(recorder), clock, cameras) { } + + /// + /// Create a video recording input from one or more game cameras. + /// + /// Media recorder to receive video frames. + /// Game cameras to record. + public CameraInput (MediaRecorder recorder, params Camera[] cameras) : this(recorder, default, cameras) { } + + /// + /// Create a video recording input from one or more game cameras. + /// + /// Texture input to receive video frames. + /// Recording clock for generating timestamps. + /// Game cameras to record. 
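// Usage sketch (illustrative only; `recorder` is an existing MediaRecorder). Records the main
// scene camera; the clock argument is omitted because it is optional.
namespace VideoKit.Recorders.Inputs {
    using UnityEngine;

    static class CameraInputExample {
        public static CameraInput RecordMainCamera (MediaRecorder recorder) =>
            new CameraInput(recorder, Camera.main);     // dispose the input to stop recording frames
    }
}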
+ public CameraInput (TextureInput input, IClock clock, params Camera[] cameras) { + // Sort cameras by depth + Array.Sort(cameras, (a, b) => (int)(100 * (a.depth - b.depth))); + var (width, height) = input.frameSize; + // Save state + this.input = input; + this.clock = clock; + this.cameras = cameras; + this.descriptor = new RenderTextureDescriptor(width, height, RenderTextureFormat.ARGBHalf, 24) { + sRGB = true, + msaaSamples = Mathf.Max(QualitySettings.antiAliasing, 1) + }; + // Start recording + attachment = new GameObject(@"VideoKit.CameraInput.Attachment").AddComponent(); + attachment.StartCoroutine(CommitFrames()); + } + + /// + /// Create a video recording input from one or more game cameras. + /// + /// Texture input to receive video frames. + /// Game cameras to record. + public CameraInput (TextureInput input, params Camera[] cameras) : this(input, default, cameras) { } + + /// + /// Stop recorder input and release resources. + /// + public void Dispose () { + GameObject.DestroyImmediate(attachment.gameObject); + input.Dispose(); + } + #endregion + + + #region --Operations-- + private readonly TextureInput input; + private readonly IClock clock; + private readonly RenderTextureDescriptor descriptor; + private readonly CameraInputAttachment attachment; + private int frameCount; + + private IEnumerator CommitFrames () { + var yielder = new WaitForEndOfFrame(); + for (;;) { + // Check frame index + yield return yielder; + if (frameCount++ % (frameSkip + 1) != 0) + continue; + // Render cameras + var frameBuffer = RenderTexture.GetTemporary(descriptor); + ClearFrame(frameBuffer); + for (var i = 0; i < cameras.Count; i++) + CommitFrame(cameras[i], frameBuffer); + // Commit + input.CommitFrame(frameBuffer, clock?.timestamp ?? 0L); + RenderTexture.ReleaseTemporary(frameBuffer); + } + } + + protected virtual void ClearFrame (RenderTexture renderTexture) { + var prevActive = RenderTexture.active; + RenderTexture.active = renderTexture; + GL.Clear(true, true, Color.black); + RenderTexture.active = prevActive; + } + + protected virtual void CommitFrame (Camera source, RenderTexture destination) { + var prevTarget = source.targetTexture; + source.targetTexture = destination; + source.Render(); + source.targetTexture = prevTarget; + } + + private sealed class CameraInputAttachment : MonoBehaviour { } + #endregion + + + #region --DEPRECATED-- + [Obsolete(@"Deprecated in NatCorder 1.9.3. This property is no longer necessary.")] + public bool HDR; + #endregion + } +} \ No newline at end of file diff --git a/Runtime/Inputs/CameraInput.cs.meta b/Runtime/Inputs/CameraInput.cs.meta new file mode 100644 index 0000000..13ad962 --- /dev/null +++ b/Runtime/Inputs/CameraInput.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: 724ba1f1364cf4b0a99e083b610bf81e +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {fileID: 2800000, guid: d6df589e0376343f7a3b406329630a74, type: 3} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Inputs/CropTextureInput.cs b/Runtime/Inputs/CropTextureInput.cs new file mode 100644 index 0000000..9329f21 --- /dev/null +++ b/Runtime/Inputs/CropTextureInput.cs @@ -0,0 +1,91 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +namespace VideoKit.Recorders.Inputs { + + using UnityEngine; + + /// + /// Recorder input for recording a cropped region of committed video frames. + /// The crop texture input always preserves the aspect of the cropped image. 
+ /// + public sealed class CropTextureInput : TextureInput, ITextureInput { + + #region --Client API-- + /// + public override (int width, int height) frameSize => input.frameSize; + + /// + /// Crop rect in pixel coordinates of the recorder. + /// + public RectInt rect; + + /// + /// Create a crop texture input. + /// + /// Media recorder to receive watermarked frames. + public CropTextureInput (MediaRecorder recorder) : this(CreateDefault(recorder)) { } + + /// + /// Create a crop texture input. + /// + /// Texture input to receive cropped frames. + public CropTextureInput (TextureInput input) : base(null) { + this.input = input; + this.rect = new RectInt(0, 0, input.frameSize.width, input.frameSize.height); + } + + /// + /// Commit a video frame from a texture. + /// + /// Source texture. + /// Frame timestamp in nanoseconds. + public override void CommitFrame (Texture texture, long timestamp) { + // Create frame buffer + var (width, height) = frameSize; + var descriptor = new RenderTextureDescriptor(width, height, RenderTextureFormat.ARGB32, 0); + var frameBuffer = RenderTexture.GetTemporary(descriptor); + // Render + (this as ITextureInput).CommitFrame(texture, frameBuffer); + // Commit + input.CommitFrame(frameBuffer, timestamp); + RenderTexture.ReleaseTemporary(frameBuffer); + } + + /// + /// Stop the recorder input and release resources. + /// + public override void Dispose () { + input.Dispose(); + base.Dispose(); + } + #endregion + + + #region --Operations-- + private readonly TextureInput input; + + void ITextureInput.CommitFrame (Texture source, RenderTexture destination) { + // Compute crop scale + var frameSize = new Vector2(destination.width, destination.height); + var ratio = new Vector2(frameSize.x / rect.width, frameSize.y / rect.height); + var scale = Mathf.Max(ratio.x, ratio.y); + // Compute draw rect + var pixelSize = scale * frameSize; + var minPoint = 0.5f * frameSize - scale * rect.center; + var maxPoint = minPoint + pixelSize; + var drawRect = new Rect(minPoint.x, destination.height - maxPoint.y, pixelSize.x, pixelSize.y); + // Render + var prevActive = RenderTexture.active; + RenderTexture.active = destination; + GL.PushMatrix(); + GL.LoadPixelMatrix(0, destination.width, destination.height, 0); + Graphics.DrawTexture(drawRect, source); + GL.PopMatrix(); + RenderTexture.active = prevActive; + } + #endregion + } +} \ No newline at end of file diff --git a/Runtime/Inputs/CropTextureInput.cs.meta b/Runtime/Inputs/CropTextureInput.cs.meta new file mode 100644 index 0000000..d01fdce --- /dev/null +++ b/Runtime/Inputs/CropTextureInput.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: 8d84024fbc4aa4b62a4d36038eb15e6b +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {fileID: 2800000, guid: d6df589e0376343f7a3b406329630a74, type: 3} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Inputs/CustomTextureInput.cs b/Runtime/Inputs/CustomTextureInput.cs new file mode 100644 index 0000000..8edbb34 --- /dev/null +++ b/Runtime/Inputs/CustomTextureInput.cs @@ -0,0 +1,98 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +namespace VideoKit.Recorders.Inputs { + + using System; + using System.Collections; + using UnityEngine; + using Clocks; + using Inputs; + + /// + /// Recorder input for recording video frames from a texture. 
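// Usage sketch (illustrative only; `recorder` is an existing MediaRecorder). Crops committed
// frames to a centered square region before they reach the recorder.
namespace VideoKit.Recorders.Inputs {
    using UnityEngine;

    static class CropTextureInputExample {
        public static CropTextureInput CreateCenteredCrop (MediaRecorder recorder) {
            var input = new CropTextureInput(recorder);
            var (width, height) = input.frameSize;
            var size = Mathf.Min(width, height);
            input.rect = new RectInt((width - size) / 2, (height - size) / 2, size, size);
            return input;
        }
    }
}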
+ /// Unlike the standard `TextureInput` family, this class serves primarily as a + /// way to connect Unity's `Update` loop to a texture input's `CommitFrame` method. + /// + internal sealed class CustomTextureInput : IDisposable { + + #region --Client API-- + /// + /// Number of successive camera frames to skip while recording. + /// This is very useful for GIF recording, which typically has a lower framerate appearance. + /// + public int frameSkip; + + /// + /// Create a custom texture input. + /// + /// Media recorder to receive video frames. + /// Clock for generating timestamps. + /// Texture to record from. + public CustomTextureInput (MediaRecorder recorder, IClock clock, Texture texture) : this(TextureInput.CreateDefault(recorder), clock, texture) { } + + /// + /// Create a custom texture input. + /// + /// Media recorder to receive video frames. + /// Texture to record from. + public CustomTextureInput (MediaRecorder recorder, Texture texture) : this(recorder, default, texture) { } + + /// + /// Create a custom texture input. + /// + /// Texture input to receive video frames. + /// Clock for generating timestamps. + /// Texture to record from. + public CustomTextureInput (TextureInput input, IClock clock, Texture texture) { + this.input = input; + this.clock = clock; + this.texture = texture; + this.attachment = new GameObject(@"VideoKitCustomTextureInputAttachment").AddComponent(); + this.frameIdx = 0; + // Start recording + attachment.StartCoroutine(CommitFrames()); + } + + /// + /// Create a custom texture input. + /// + /// Texture input to receive video frames. + /// Texture to record from. + public CustomTextureInput (TextureInput input, Texture texture) : this(input, default, texture) { } + + /// + /// Stop recorder input and release resources. + /// + public void Dispose () { + GameObject.DestroyImmediate(attachment.gameObject); + input.Dispose(); + } + #endregion + + + #region --Operations-- + private readonly TextureInput input; + private readonly IClock clock; + private readonly Texture texture; + private readonly CustomTextureInputAttachment attachment; + private int frameIdx; + + private IEnumerator CommitFrames () { + var yielder = new WaitForEndOfFrame(); + for (;;) { + // Check frame index + yield return yielder; + if (frameIdx++ % (frameSkip + 1) != 0) + continue; + // Commit + input.CommitFrame(texture, clock?.timestamp ?? 0L); + } + } + + private sealed class CustomTextureInputAttachment : MonoBehaviour { } + #endregion + } +} \ No newline at end of file diff --git a/Runtime/Inputs/CustomTextureInput.cs.meta b/Runtime/Inputs/CustomTextureInput.cs.meta new file mode 100644 index 0000000..dbebb71 --- /dev/null +++ b/Runtime/Inputs/CustomTextureInput.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: fbd53668dc5734ae3a23c0e3a4123ce6 +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {fileID: 2800000, guid: d6df589e0376343f7a3b406329630a74, type: 3} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Inputs/GLESTextureInput.cs b/Runtime/Inputs/GLESTextureInput.cs new file mode 100644 index 0000000..ffedd71 --- /dev/null +++ b/Runtime/Inputs/GLESTextureInput.cs @@ -0,0 +1,136 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. 
+*/ + +namespace VideoKit.Recorders.Inputs { + + using AOT; + using System; + using System.Runtime.CompilerServices; + using System.Runtime.InteropServices; + using UnityEngine; + using UnityEngine.Rendering; + using Internal; + + /// + /// Recorder input for recording video frames from textures with hardware acceleration on Android OpenGL ES3. + /// + public sealed class GLESTextureInput : TextureInput { + + #region --Client API-- + /// + /// Create a GLES texture input. + /// + /// Media recorder to receive video frames. + public GLESTextureInput (MediaRecorder recorder) : base(recorder) { + // Check platform + if (Application.platform != RuntimePlatform.Android) + throw new InvalidOperationException(@"GLESTextureInput can only be used on Android"); + // Check render API + if (SystemInfo.graphicsDeviceType != GraphicsDeviceType.OpenGLES3) + throw new InvalidOperationException(@"GLESTextureInput can only be used with OpenGL ES3"); + // Initialize + var (width, height) = recorder.frameSize; + this.frameBuffer = new RenderTexture(width, height, 0, RenderTextureFormat.ARGB32); + this.frameBuffer.Create(); + this.frameBufferID = frameBuffer.GetNativeTexturePtr(); + this.fence = new object(); + // Create native input + VideoKitExt.CreateTexutreInput(width, height, OnReadbackCompleted, out input); + } + + /// + /// Commit a video frame from a texture. + /// + /// Source texture. + /// Frame timestamp in nanoseconds. + public override unsafe void CommitFrame (Texture texture, long timestamp) { + // Check + lock (fence) + if (input == default) + return; + // Create command buffer + Action callback = pixelBuffer => recorder.CommitFrame((void*)pixelBuffer, timestamp); + var handle = GCHandle.Alloc(callback, GCHandleType.Normal); + var commandBuffer = new CommandBuffer(); + commandBuffer.name = @"GLESTextureInput"; + commandBuffer.Blit(texture, frameBuffer); + RunOnRenderThread(commandBuffer, () => { + lock (fence) + input.CommitFrame(frameBufferID, (IntPtr)handle); + }); + // Execute + Graphics.ExecuteCommandBuffer(commandBuffer); + } + + /// + /// Stop recorder input and release resources. + /// + public override void Dispose () { + lock (fence) { + frameBuffer.Release(); + input.ReleaseTextureInput(); + input = default; + base.Dispose(); + } + } + #endregion + + + #region --Operations-- + private readonly RenderTexture frameBuffer; + private readonly IntPtr frameBufferID; + private IntPtr input; + private readonly object fence; + private static readonly IntPtr RenderThreadCallback; + + static GLESTextureInput () => RenderThreadCallback = Marshal.GetFunctionPointerForDelegate(OnRenderThreadInvoke); + + /// + /// Register a delegate to be invoked on the Unity render thread. + /// + /// Command buffer. + /// Delegate to invoke on the Unity render thread. 
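// Usage sketch (illustrative only; `recorder` is an existing MediaRecorder). The constructor above
// throws unless the app is running on Android with OpenGL ES3, so guard explicit construction
// accordingly; in most cases TextureInput.CreateDefault makes this choice automatically.
namespace VideoKit.Recorders.Inputs {
    using UnityEngine;
    using UnityEngine.Rendering;

    static class GLESTextureInputExample {
        public static TextureInput Create (MediaRecorder recorder) {
            var gles3OnAndroid =
                Application.platform == RuntimePlatform.Android &&
                SystemInfo.graphicsDeviceType == GraphicsDeviceType.OpenGLES3;
            return gles3OnAndroid ? new GLESTextureInput(recorder) : TextureInput.CreateDefault(recorder);
        }
    }
}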
+ private static void RunOnRenderThread (CommandBuffer commandBuffer, Action action) { + var handle = GCHandle.Alloc(action, GCHandleType.Normal); + commandBuffer.IssuePluginEventAndData(RenderThreadCallback, default, (IntPtr)handle); + } + + [MonoPInvokeCallback(typeof(UnityRenderingEventAndData))] + private static void OnRenderThreadInvoke (int _, IntPtr context) { + // Check + if (!VideoKit.IsAppDomainLoaded) + return; + try { + // Invoke + var handle = (GCHandle)context; + var action = handle.Target as Action; + handle.Free(); + action?.Invoke(); + } catch (Exception ex) { + Debug.LogException(ex); + } + } + + [MonoPInvokeCallback(typeof(VideoKitExt.ReadbackHandler))] + private static void OnReadbackCompleted (IntPtr context, IntPtr pixelBuffer) { + // Check + if (!VideoKit.IsAppDomainLoaded) + return; + try { + // Invoke + var handle = (GCHandle)context; + var handler = handle.Target as Action; + handle.Free(); + handler?.Invoke(pixelBuffer); + } catch (Exception ex) { + Debug.LogException(ex); + } + } + + [UnmanagedFunctionPointer(CallingConvention.Cdecl)] + private delegate void UnityRenderingEventAndData (int _, IntPtr data); + #endregion + } +} \ No newline at end of file diff --git a/Runtime/Inputs/GLESTextureInput.cs.meta b/Runtime/Inputs/GLESTextureInput.cs.meta new file mode 100644 index 0000000..52dde4c --- /dev/null +++ b/Runtime/Inputs/GLESTextureInput.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: 8366bb74bc68146d2aca4464fbb34ac3 +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {fileID: 2800000, guid: d6df589e0376343f7a3b406329630a74, type: 3} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Inputs/ITextureInput.cs b/Runtime/Inputs/ITextureInput.cs new file mode 100644 index 0000000..96e3d64 --- /dev/null +++ b/Runtime/Inputs/ITextureInput.cs @@ -0,0 +1,14 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +namespace VideoKit.Recorders.Inputs { + + using UnityEngine; + + internal interface ITextureInput { + + void CommitFrame (Texture source, RenderTexture destination); + } +} \ No newline at end of file diff --git a/Runtime/Inputs/ITextureInput.cs.meta b/Runtime/Inputs/ITextureInput.cs.meta new file mode 100644 index 0000000..714291a --- /dev/null +++ b/Runtime/Inputs/ITextureInput.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: b52784345d168460e82fed137dcf66d4 +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {fileID: 2800000, guid: d6df589e0376343f7a3b406329630a74, type: 3} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Inputs/RecorderTextureInput.cs b/Runtime/Inputs/RecorderTextureInput.cs new file mode 100644 index 0000000..a5f2c9c --- /dev/null +++ b/Runtime/Inputs/RecorderTextureInput.cs @@ -0,0 +1,65 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +namespace VideoKit.Recorders.Inputs { + + using UnityEngine; + + /// + /// Texture input that supports stacking other texture inputs. + /// + internal sealed class RecorderTextureInput : TextureInput { + + #region --Client API-- + /// + /// Texture inputs. + /// + public readonly ITextureInput[] inputs; + + /// + /// Create a recorder texture input. + /// + /// Media recorder to receive video frames. + /// Inputs to apply to video frames. 
+ public RecorderTextureInput (MediaRecorder recorder, params ITextureInput[] inputs) : base(recorder) { + this.input = CreateDefault(recorder); + this.inputs = inputs; + } + + /// + /// Commit a video frame from a texture. + /// + /// Source texture. + /// Frame timestamp in nanoseconds. + public override void CommitFrame (Texture texture, long timestamp) { + var (width, height) = frameSize; + var descriptor = new RenderTextureDescriptor(width, height, RenderTextureFormat.ARGB32, 0); + var result = RenderTexture.GetTemporary(descriptor); + Graphics.Blit(texture, result); + foreach (var textureInput in inputs) { + var source = RenderTexture.GetTemporary(descriptor); + Graphics.Blit(result, source); + textureInput.CommitFrame(source, result); + RenderTexture.ReleaseTemporary(source); + } + input.CommitFrame(result, timestamp); + RenderTexture.ReleaseTemporary(result); + } + + /// + /// Stop the recorder input and release resources. + /// + public override void Dispose () { + input.Dispose(); + base.Dispose(); + } + #endregion + + + #region --Operations-- + private readonly TextureInput input; + #endregion + } +} \ No newline at end of file diff --git a/Runtime/Inputs/RecorderTextureInput.cs.meta b/Runtime/Inputs/RecorderTextureInput.cs.meta new file mode 100644 index 0000000..9e1c500 --- /dev/null +++ b/Runtime/Inputs/RecorderTextureInput.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: c5f296db4fdef4f4e895be7e7fd8a5bb +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {fileID: 2800000, guid: d6df589e0376343f7a3b406329630a74, type: 3} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Inputs/ScreenInput.cs b/Runtime/Inputs/ScreenInput.cs new file mode 100644 index 0000000..05d8081 --- /dev/null +++ b/Runtime/Inputs/ScreenInput.cs @@ -0,0 +1,92 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +namespace VideoKit.Recorders.Inputs { + + using System; + using System.Collections; + using UnityEngine; + using Clocks; + + /// + /// Recorder input for recording video frames from the screen. + /// Unlike the `CameraInput`, this recorder input is able to record overlay UI canvases. + /// + public sealed class ScreenInput : IDisposable { + + #region --Client API-- + /// + /// Control number of successive camera frames to skip while recording. + /// This is very useful for GIF recording, which typically has a lower framerate appearance. + /// + public int frameSkip; + + /// + /// Create a video recording input from the screen. + /// + /// Media recorder to receive video frames. + /// Recording clock for generating timestamps. + public ScreenInput (MediaRecorder recorder, IClock clock = default) : this( + TextureInput.CreateDefault(recorder), + clock + ) { } + + /// + /// Create a video recording input from the screen. + /// + /// Texture input to receive video frames. + /// Recording clock for generating timestamps. + public ScreenInput (TextureInput input, IClock clock = default) { + this.input = input; + this.clock = clock; + this.descriptor = new RenderTextureDescriptor(input.frameSize.width, input.frameSize.height, RenderTextureFormat.ARGBHalf, 0); + // Start recording + attachment = new GameObject(@"NatCorder ScreenInputAttachment").AddComponent(); + attachment.StartCoroutine(CommitFrames()); + } + + /// + /// Stop recorder input and release resources. 
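// Usage sketch (illustrative only; `recorder` is an existing MediaRecorder). Records the full
// screen, including overlay UI canvases.
namespace VideoKit.Recorders.Inputs {

    static class ScreenInputExample {
        public static ScreenInput RecordScreen (MediaRecorder recorder) =>
            new ScreenInput(recorder);      // dispose the input to stop committing frames
    }
}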
+ /// + public void Dispose () { + GameObject.DestroyImmediate(attachment.gameObject); + input.Dispose(); + } + #endregion + + + #region --Operations-- + private readonly TextureInput input; + private readonly IClock clock; + private readonly RenderTextureDescriptor descriptor; + private readonly ScreenInputAttachment attachment; + private int frameCount; + + private IEnumerator CommitFrames () { + var yielder = new WaitForEndOfFrame(); + for (;;) { + // Check frame index + yield return yielder; + if (frameCount++ % (frameSkip + 1) != 0) + continue; + // Capture screen + var frameBuffer = RenderTexture.GetTemporary(descriptor); + if (SystemInfo.graphicsUVStartsAtTop) { + var tempBuffer = RenderTexture.GetTemporary(descriptor); + ScreenCapture.CaptureScreenshotIntoRenderTexture(tempBuffer); + Graphics.Blit(tempBuffer, frameBuffer, new Vector2(1, -1), Vector2.up); + RenderTexture.ReleaseTemporary(tempBuffer); + } else + ScreenCapture.CaptureScreenshotIntoRenderTexture(frameBuffer); + // Commit + input.CommitFrame(frameBuffer, clock?.timestamp ?? 0L); + RenderTexture.ReleaseTemporary(frameBuffer); + } + } + + private sealed class ScreenInputAttachment : MonoBehaviour { } + #endregion + } +} \ No newline at end of file diff --git a/Runtime/Inputs/ScreenInput.cs.meta b/Runtime/Inputs/ScreenInput.cs.meta new file mode 100644 index 0000000..e94644f --- /dev/null +++ b/Runtime/Inputs/ScreenInput.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: 595b78f8849ac42b18ee7666560f8dca +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {fileID: 2800000, guid: d6df589e0376343f7a3b406329630a74, type: 3} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Inputs/TextureInput.cs b/Runtime/Inputs/TextureInput.cs new file mode 100644 index 0000000..0ef961a --- /dev/null +++ b/Runtime/Inputs/TextureInput.cs @@ -0,0 +1,87 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +namespace VideoKit.Recorders.Inputs { + + using System; + using UnityEngine; + using UnityEngine.Rendering; + using Unity.Collections.LowLevel.Unsafe; + + /// + /// Recorder input for recording video frames from textures. + /// Textures will be recorded by performing a synchronous pixel buffer readback. + /// + public class TextureInput : IDisposable { + + #region --Client API-- + /// + /// Texture input frame size. + /// Note that it is not required that committed textures have the frame size, though it is recommended. + /// + public virtual (int width, int height) frameSize => recorder.frameSize; + + /// + /// When an Android app renders with OpenGL ES3, the default `TextureInput` implementation can be very expensive. + /// Enabling this flag will use the custom `GLESTextureInput` to accelerate pixel buffer readbacks from the GPU. + /// Apps that use ARFoundation will see recording performance greatly benefit from enabling this flag. + /// This flag only has an effect on Android when rendering with OpenGL ES3. + /// This flag defaults to `true`. + /// + public static bool UseGLESTextureInput = true; + + /// + /// Create a texture input which performs synchronous readbacks. + /// + /// Media recorder to receive video frames. + public TextureInput (MediaRecorder recorder) => this.recorder = recorder; + + /// + /// Commit a video frame from a texture. + /// + /// Source texture. + /// Frame timestamp in nanoseconds. 
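// Usage sketch (illustrative only; `recorder` is an existing MediaRecorder). `CreateDefault` picks
// the most appropriate readback path for the current platform; the Android GLES3 fast path can be
// disabled beforehand via the flag documented above.
namespace VideoKit.Recorders.Inputs {

    static class TextureInputFactoryExample {
        public static TextureInput CreateInput (MediaRecorder recorder) {
            TextureInput.UseGLESTextureInput = false;       // opt out of the GLES3 fast path
            return TextureInput.CreateDefault(recorder);    // async readback where supported, else synchronous
        }
    }
}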
+ public virtual void CommitFrame (Texture texture, long timestamp) { + // Blit + var (width, height) = recorder.frameSize; + var renderTexture = RenderTexture.GetTemporary(width, height, 24, RenderTextureFormat.ARGB32); + Graphics.Blit(texture, renderTexture); + // Readback // Completely kills performance + var prevActive = RenderTexture.active; + RenderTexture.active = renderTexture; + readbackBuffer = readbackBuffer ? readbackBuffer : new Texture2D(width, height, TextureFormat.RGBA32, false); + readbackBuffer.ReadPixels(new Rect(0, 0, width, height), 0, 0, false); + RenderTexture.active = prevActive; + RenderTexture.ReleaseTemporary(renderTexture); + // Commit + recorder.CommitFrame(readbackBuffer.GetRawTextureData(), timestamp); + } + + /// + /// Stop recorder input and release resources. + /// + public virtual void Dispose () => Texture2D.Destroy(readbackBuffer); + + /// + /// Create the platform default texture input for the given recorder. + /// + /// + /// Create texture input. + public static TextureInput CreateDefault (MediaRecorder recorder) => Application.platform switch { + RuntimePlatform.Android when AllowGLESDefault => new GLESTextureInput(recorder), + RuntimePlatform.WebGLPlayer => new TextureInput(recorder), + _ when SystemInfo.supportsAsyncGPUReadback => new AsyncTextureInput(recorder), + _ => new TextureInput(recorder), + }; + #endregion + + + #region --Operations-- + protected readonly MediaRecorder recorder; + private Texture2D readbackBuffer; + private static bool AllowGLESDefault => SystemInfo.graphicsDeviceType == GraphicsDeviceType.OpenGLES3 && UseGLESTextureInput; + #endregion + } +} \ No newline at end of file diff --git a/Runtime/Inputs/TextureInput.cs.meta b/Runtime/Inputs/TextureInput.cs.meta new file mode 100644 index 0000000..943a319 --- /dev/null +++ b/Runtime/Inputs/TextureInput.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: 9e38a4378b52d46928be2f1d614c6ecc +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {fileID: 2800000, guid: d6df589e0376343f7a3b406329630a74, type: 3} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Inputs/WatermarkTextureInput.cs b/Runtime/Inputs/WatermarkTextureInput.cs new file mode 100644 index 0000000..31dda4b --- /dev/null +++ b/Runtime/Inputs/WatermarkTextureInput.cs @@ -0,0 +1,86 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +namespace VideoKit.Recorders.Inputs { + + using UnityEngine; + + /// + /// Recorder input for recording video frames from textures with a watermark. + /// + public sealed class WatermarkTextureInput : TextureInput, ITextureInput { + + #region --Client API-- + /// + public override (int width, int height) frameSize => input?.frameSize ?? (0, 0); + + /// + /// Watermark image. + /// If `null`, no watermark will be rendered. + /// + public Texture watermark; + + /// + /// Watermark display rect in pixel coordinates of the recorder. + /// + public RectInt rect; + + /// + /// Create a watermark texture input. + /// + /// Media recorder to receive watermarked frames. + public WatermarkTextureInput (MediaRecorder recorder) : this(CreateDefault(recorder)) { } + + /// + /// Create a watermark texture input. + /// + /// Texture input to receive watermarked frames. + public WatermarkTextureInput (TextureInput input) : base(null) => this.input = input; + + /// + /// Commit a video frame from a texture. + /// + /// Source texture. + /// Frame timestamp in nanoseconds. 
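// Illustrative usage sketch (not from the original sources): stamping a watermark onto
// recorded frames. `recorder`, `logoTexture`, `sourceTexture`, and `clock` are hypothetical
// objects supplied by the caller.
var watermarkInput = new WatermarkTextureInput(recorder) {
    watermark = logoTexture,
    rect = new RectInt(16, 16, 256, 128)                // pixel coordinates of the recorder frame
};
watermarkInput.CommitFrame(sourceTexture, clock.timestamp);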
+ public override void CommitFrame (Texture texture, long timestamp) { + // Create frame buffer + var (width, height) = frameSize; + var descriptor = new RenderTextureDescriptor(width, height, RenderTextureFormat.ARGB32, 0); + var frameBuffer = RenderTexture.GetTemporary(descriptor); + // Render + (this as ITextureInput).CommitFrame(texture, frameBuffer); + // Commit + input.CommitFrame(frameBuffer, timestamp); + RenderTexture.ReleaseTemporary(frameBuffer); + } + + /// + /// Stop the recorder input and release resources. + /// + public override void Dispose () { + input.Dispose(); + base.Dispose(); + } + #endregion + + + #region --Operations-- + private readonly TextureInput input; + + void ITextureInput.CommitFrame (Texture source, RenderTexture destination) { + var drawRect = new Rect(rect.x, destination.height - rect.max.y, rect.width, rect.height); + var prevActive = RenderTexture.active; + RenderTexture.active = destination; + GL.PushMatrix(); + GL.LoadPixelMatrix(0, destination.width, destination.height, 0); + Graphics.Blit(source, destination); + if (watermark) + Graphics.DrawTexture(drawRect, watermark); + GL.PopMatrix(); + RenderTexture.active = prevActive; + } + #endregion + } +} \ No newline at end of file diff --git a/Runtime/Inputs/WatermarkTextureInput.cs.meta b/Runtime/Inputs/WatermarkTextureInput.cs.meta new file mode 100644 index 0000000..b3d120e --- /dev/null +++ b/Runtime/Inputs/WatermarkTextureInput.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: 13ee4266f456e4246b4b5f9099b5bd8c +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {fileID: 2800000, guid: d6df589e0376343f7a3b406329630a74, type: 3} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Internal.meta b/Runtime/Internal.meta new file mode 100644 index 0000000..a82bbd4 --- /dev/null +++ b/Runtime/Internal.meta @@ -0,0 +1,8 @@ +fileFormatVersion: 2 +guid: 26b3d72deb06541df8c8eb37904af475 +folderAsset: yes +DefaultImporter: + externalObjects: {} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Internal/VideoKit.cs b/Runtime/Internal/VideoKit.cs new file mode 100644 index 0000000..a1480e4 --- /dev/null +++ b/Runtime/Internal/VideoKit.cs @@ -0,0 +1,651 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. 
+*/ + +#nullable enable + +namespace VideoKit.Internal { + + using System; + using System.Runtime.InteropServices; + using System.Text; + using UnityEngine; + using Devices; + + public static class VideoKit { + + public const string Assembly = + #if (UNITY_IOS || UNITY_WEBGL) && !UNITY_EDITOR + @"__Internal"; + #else + @"VideoKit"; + #endif + + + #region --Enumerations-- + public enum AssetType : int { + Unknown = 0, + Image = 1, + Audio = 2, + Video = 3, + } + + [Flags] + public enum DeviceFlags : int { + // MediaDevice + Internal = 1 << 0, + External = 1 << 1, + Default = 1 << 3, + // AudioDevice + EchoCancellation = 1 << 2, + // CameraDevice + FrontFacing = 1 << 6, + Flash = 1 << 7, + Torch = 1 << 8, + Depth = 1 << 15, + // CameraDevice.Exposure + ExposureContinuous = 1 << 16, + ExposureLock = 1 << 11, + ExposureManual = 1 << 14, + ExposurePoint = 1 << 9, + // CameraDevice.Focus + FocusContinuous = 1 << 17, + FocusLock = 1 << 12, + FocusPoint = 1 << 10, + // CameraDevice.WhiteBalance + WhiteBalanceContinuous = 1 << 18, + WhiteBalanceLock = 1 << 13, + // CameraDevice.VideoStabilization + VideoStabilization = 1 << 19, + } + + public enum Metadata : int { + IntrinsicMatrix = 1, + ExposureBias = 2, + ExposureDuration = 3, + FocalLength = 4, + FNumber = 5, + Brightness = 6, + ISO = 7, + } + + public enum PermissionType : int { + Microphone = 1, + Camera = 2 + } + + public enum Status : int { + Ok = 0, + InvalidArgument = 1, + InvalidOperation = 2, + NotImplemented = 3, + InvalidSession = 101, + InvalidPlan = 104, + LimitedPlan = 105, + } + #endregion + + + #region --Delegates-- + public delegate void RecordingHandler (IntPtr context, IntPtr path); + public delegate void DeviceDiscoveryHandler (IntPtr context, IntPtr devices, int count); + public delegate void SampleBufferHandler (IntPtr context, IntPtr sampleBuffer); + public delegate void DeviceDisconnectHandler (IntPtr context); + public delegate void PermissionResultHandler (IntPtr context, MediaDevice.PermissionStatus result); + public delegate void AssetLoadHandler (IntPtr context, IntPtr path, AssetType type, int width, int height, float frameRate, int sampleRate, int channelCount, float duration); + public delegate void AssetShareHandler (IntPtr context, IntPtr receiver); + #endregion + + + #region --VKTSession-- + [DllImport(Assembly, EntryPoint = @"VKTGetBundleIdentifier")] + public static extern Status BundleIdentifier ( + [MarshalAs(UnmanagedType.LPStr)] StringBuilder dest + ); + + [DllImport(Assembly, EntryPoint = @"VKTGetSessionStatus")] + public static extern Status SessionStatus (); + + [DllImport(Assembly, EntryPoint = @"VKTSetSessionToken")] + public static extern Status SetSessionToken ( + [MarshalAs(UnmanagedType.LPUTF8Str)] string? 
token + ); + #endregion + + + #region --VKTAsset-- + [DllImport(Assembly, EntryPoint = @"VKTAssetLoad")] + public static extern Status LoadAsset ( + [MarshalAs(UnmanagedType.LPUTF8Str)] string path, + AssetLoadHandler handler, + IntPtr context + ); + + [DllImport(Assembly, EntryPoint = @"VKTAssetLoadFromCameraRoll")] + public static extern Status LoadAssetFromCameraRoll ( + AssetType type, + AssetLoadHandler handler, + IntPtr context + ); + + [DllImport(Assembly, EntryPoint = @"VKTAssetShare")] + public static extern Status ShareAsset ( + [MarshalAs(UnmanagedType.LPUTF8Str)] string path, + [MarshalAs(UnmanagedType.LPUTF8Str)] string message, + AssetShareHandler handler, + IntPtr context + ); + + [DllImport(Assembly, EntryPoint = @"VKTAssetSaveToCameraRoll")] + public static extern Status SaveAssetToCameraRoll ( + [MarshalAs(UnmanagedType.LPUTF8Str)] string path, + [MarshalAs(UnmanagedType.LPUTF8Str)] string album, + AssetShareHandler handler, + IntPtr context + ); + #endregion + + + #region --VKTRecorder-- + [DllImport(Assembly, EntryPoint = @"VKTRecorderGetFrameSize")] + public static extern Status FrameSize ( + this IntPtr recorder, + out int width, + out int height + ); + + [DllImport(Assembly, EntryPoint = @"VKTRecorderCommitFrame")] + public static extern unsafe Status CommitFrame ( + this IntPtr recorder, + void* pixelBuffer, + long timestamp + ); + + [DllImport(Assembly, EntryPoint = @"VKTRecorderCommitSamples")] + public static extern unsafe Status CommitSamples ( + this IntPtr recorder, + float* sampleBuffer, + int sampleCount, + long timestamp + ); + + [DllImport(Assembly, EntryPoint = @"VKTRecorderFinishWriting")] + public static extern Status FinishWriting ( + this IntPtr recorder, + RecordingHandler handler, + IntPtr context + ); + + [DllImport(Assembly, EntryPoint = @"VKTRecorderCreateMP4")] + public static extern Status CreateMP4Recorder ( + [MarshalAs(UnmanagedType.LPUTF8Str)] string path, + int width, + int height, + float frameRate, + int sampleRate, + int channelCount, + int videoBitrate, + int keyframeInterval, + int audioBitRate, + out IntPtr recorder + ); + + [DllImport(Assembly, EntryPoint = @"VKTRecorderCreateHEVC")] + public static extern Status CreateHEVCRecorder ( + [MarshalAs(UnmanagedType.LPUTF8Str)] string path, + int width, + int height, + float frameRate, + int sampleRate, + int channelCount, + int videoBitRate, + int keyframeInterval, + int audioBitRate, + out IntPtr recorder + ); + + [DllImport(Assembly, EntryPoint = @"VKTRecorderCreateGIF")] + public static extern Status CreateGIFRecorder ( + [MarshalAs(UnmanagedType.LPUTF8Str)] string path, + int width, + int height, + float delay, + out IntPtr recorder + ); + + [DllImport(Assembly, EntryPoint = @"VKTRecorderCreateWAV")] + public static extern Status CreateWAVRecorder ( + [MarshalAs(UnmanagedType.LPUTF8Str)] string path, + int sampleRate, + int channelCount, + out IntPtr recorder + ); + + [DllImport(Assembly, EntryPoint = @"VKTRecorderCreateWEBM")] + public static extern Status CreateWEBMRecorder ( + [MarshalAs(UnmanagedType.LPUTF8Str)] string path, + int width, + int height, + float frameRate, + int sampleRate, + int channelCount, + int videoBitRate, + int keyframeInterval, + int audioBitRate, + out IntPtr recorder + ); + + [DllImport(Assembly, EntryPoint = @"VKTRecorderCreateJPEG")] + public static extern Status CreateJPEGRecorder ( + [MarshalAs(UnmanagedType.LPUTF8Str)] string path, + int width, + int height, + float quality, + out IntPtr recorder + ); + #endregion + + + #region --VKTDevice-- + 
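// Illustrative usage sketch (not from the original sources): the raw VKTRecorder bindings
// above are normally reached through the managed `MediaRecorder` wrapper added later in this
// patch, but can be called directly. The output path shown is hypothetical.
VideoKit.CreateMP4Recorder(
    "/tmp/recording.mp4",           // hypothetical output path
    1280, 720,                      // width, height
    30f,                            // frame rate
    44100, 2,                       // sample rate, channel count
    10_000_000, 2, 64_000,          // video bitrate, keyframe interval, audio bitrate
    out IntPtr recorderHandle
).CheckStatus();
recorderHandle.FrameSize(out var width, out var height).CheckStatus();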
[DllImport(Assembly, EntryPoint = @"VKTDeviceRelease")] + public static extern Status ReleaseDevice (this IntPtr device); + + [DllImport(Assembly, EntryPoint = @"VKTDeviceGetUniqueID")] + public static extern Status UniqueID ( + this IntPtr device, + [MarshalAs(UnmanagedType.LPStr)] StringBuilder dest + ); + + [DllImport(Assembly, EntryPoint = @"VKTDeviceGetName")] + public static extern Status Name ( + this IntPtr device, + [MarshalAs(UnmanagedType.LPStr)] StringBuilder dest + ); + + [DllImport(Assembly, EntryPoint = @"VKTDeviceGetFlags")] + public static extern DeviceFlags Flags (this IntPtr device); + + [DllImport(Assembly, EntryPoint = @"VKTDeviceIsRunning")] + public static extern bool Running (this IntPtr device); + + [DllImport(Assembly, EntryPoint = @"VKTDeviceStartRunning")] + public static extern Status StartRunning ( + this IntPtr device, + SampleBufferHandler callback, + IntPtr context + ); + + [DllImport(Assembly, EntryPoint = @"VKTDeviceStopRunning")] + public static extern Status StopRunning (this IntPtr device); + + [DllImport(Assembly, EntryPoint = @"VKTDeviceSetDisconnectHandler")] + public static extern Status SetDisconnectHandler ( + this IntPtr device, + DeviceDisconnectHandler handler, + IntPtr context + ); + + [DllImport(Assembly, EntryPoint = @"VKTDeviceCheckPermissions")] + public static extern Status CheckPermissions ( + PermissionType type, + bool request, + PermissionResultHandler handler, + IntPtr context + ); + #endregion + + + #region --VKTMicrophone-- + [DllImport(Assembly, EntryPoint = @"VKTDiscoverMicrophones")] + public static extern Status DiscoverMicrophones ( + DeviceDiscoveryHandler handler, + IntPtr context + ); + + [DllImport(Assembly, EntryPoint = @"VKTMicrophoneGetEchoCancellation")] + public static extern bool EchoCancellation (this IntPtr microphone); + + [DllImport(Assembly, EntryPoint = @"VKTMicrophoneSetEchoCancellation")] + public static extern void SetEchoCancellation ( + this IntPtr microphone, + bool mode + ); + + [DllImport(Assembly, EntryPoint = @"VKTMicrophoneGetSampleRate")] + public static extern int SampleRate (this IntPtr microphone); + + [DllImport(Assembly, EntryPoint = @"VKTMicrophoneSetSampleRate")] + public static extern void SetSampleRate ( + this IntPtr microphone, + int sampleRate + ); + + [DllImport(Assembly, EntryPoint = @"VKTMicrophoneGetChannelCount")] + public static extern int ChannelCount (this IntPtr microphone); + + [DllImport(Assembly, EntryPoint = @"VKTMicrophoneSetChannelCount")] + public static extern void SetChannelCount ( + this IntPtr microphone, + int sampleRate + ); + + [DllImport(Assembly, EntryPoint = @"VKTAudioBufferGetData")] + public static unsafe extern float* AudioBufferData (this IntPtr audioBuffer); + + [DllImport(Assembly, EntryPoint = @"VKTAudioBufferGetSampleCount")] + public static extern int AudioBufferSampleCount (this IntPtr audioBuffer); + + [DllImport(Assembly, EntryPoint = @"VKTAudioBufferGetSampleRate")] + public static extern int AudioBufferSampleRate (this IntPtr audioBuffer); + + [DllImport(Assembly, EntryPoint = @"VKTAudioBufferGetChannelCount")] + public static extern int AudioBufferChannelCount (this IntPtr audioBuffer); + + [DllImport(Assembly, EntryPoint = @"VKTAudioBufferGetTimestamp")] + public static extern long AudioBufferTimestamp (this IntPtr audioBuffer); + #endregion + + + #region --VKTCamera-- + [DllImport(Assembly, EntryPoint = @"VKTDiscoverCameras")] + public static extern Status DiscoverCameras ( + DeviceDiscoveryHandler handler, + IntPtr context + ); + + 
[DllImport(Assembly, EntryPoint = @"VKTCameraGetFieldOfView")] + public static extern void FieldOfView ( + this IntPtr camera, + out float x, + out float y + ); + + [DllImport(Assembly, EntryPoint = @"VKTCameraGetExposureBiasRange")] + public static extern void ExposureBiasRange ( + this IntPtr camera, + out float min, + out float max + ); + + [DllImport(Assembly, EntryPoint = @"VKTCameraGetExposureDurationRange")] + public static extern void ExposureDurationRange ( + this IntPtr camera, + out float min, + out float max + ); + + [DllImport(Assembly, EntryPoint = @"VKTCameraGetISORange")] + public static extern void ISORange ( + this IntPtr device, + out float min, + out float max + ); + + [DllImport(Assembly, EntryPoint = @"VKTCameraGetZoomRange")] + public static extern void ZoomRange ( + this IntPtr camera, + out float min, + out float max + ); + + [DllImport(Assembly, EntryPoint = @"VKTCameraGetPreviewResolution")] + public static extern void PreviewResolution ( + this IntPtr camera, + out int width, + out int height + ); + + [DllImport(Assembly, EntryPoint = @"VKTCameraSetPreviewResolution")] + public static extern void SetPreviewResolution ( + this IntPtr camera, + int width, + int height + ); + + [DllImport(Assembly, EntryPoint = @"VKTCameraGetPhotoResolution")] + public static extern void PhotoResolution ( + this IntPtr camera, + out int width, + out int height + ); + + [DllImport(Assembly, EntryPoint = @"VKTCameraSetPhotoResolution")] + public static extern void SetPhotoResolution ( + this IntPtr camera, + int width, + int height + ); + + [DllImport(Assembly, EntryPoint = @"VKTCameraGetFrameRate")] + public static extern int FrameRate (this IntPtr camera); + + [DllImport(Assembly, EntryPoint = @"VKTCameraSetFrameRate")] + public static extern void SetFrameRate ( + this IntPtr camera, + int framerate + ); + + [DllImport(Assembly, EntryPoint = @"VKTCameraGetExposureMode")] + public static extern CameraDevice.ExposureMode ExposureMode (this IntPtr camera); + + [DllImport(Assembly, EntryPoint = @"VKTCameraSetExposureMode")] + public static extern void SetExposureMode ( + this IntPtr camera, + CameraDevice.ExposureMode mode + ); + + [DllImport(Assembly, EntryPoint = @"VKTCameraGetExposureBias")] + public static extern float ExposureBias (this IntPtr camera); + + [DllImport(Assembly, EntryPoint = @"VKTCameraSetExposureBias")] + public static extern void SetExposureBias ( + this IntPtr camera, + float bias + ); + + [DllImport(Assembly, EntryPoint = @"VKTCameraGetExposureDuration")] + public static extern float ExposureDuration (this IntPtr camera); + + [DllImport(Assembly, EntryPoint = @"VKTCameraGetISO")] + public static extern float ISO (this IntPtr camera); + + [DllImport(Assembly, EntryPoint = @"VKTCameraSetExposureDuration")] + public static extern void SetExposureDuration ( + this IntPtr camera, + float duration, + float ISO + ); + + [DllImport(Assembly, EntryPoint = @"VKTCameraSetExposurePoint")] + public static extern void SetExposurePoint ( + this IntPtr camera, + float x, + float y + ); + + [DllImport(Assembly, EntryPoint = @"VKTCameraGetFlashMode")] + public static extern CameraDevice.FlashMode FlashMode (this IntPtr camera); + + [DllImport(Assembly, EntryPoint = @"VKTCameraSetFlashMode")] + public static extern void SetFlashMode ( + this IntPtr camera, + CameraDevice.FlashMode mode + ); + + [DllImport(Assembly, EntryPoint = @"VKTCameraGetFocusMode")] + public static extern CameraDevice.FocusMode FocusMode (this IntPtr camera); + + [DllImport(Assembly, EntryPoint = 
@"VKTCameraSetFocusMode")] + public static extern void SetFocusMode ( + this IntPtr camera, + CameraDevice.FocusMode mode + ); + + [DllImport(Assembly, EntryPoint = @"VKTCameraSetFocusPoint")] + public static extern void SetFocusPoint ( + this IntPtr camera, + float x, + float y + ); + + [DllImport(Assembly, EntryPoint = @"VKTCameraGetTorchMode")] + public static extern CameraDevice.TorchMode TorchMode (this IntPtr camera); + + [DllImport(Assembly, EntryPoint = @"VKTCameraSetTorchMode")] + public static extern void SetTorchMode ( + this IntPtr camera, + CameraDevice.TorchMode mode + ); + + [DllImport(Assembly, EntryPoint = @"VKTCameraGetWhiteBalanceMode")] + public static extern CameraDevice.WhiteBalanceMode WhiteBalanceMode (this IntPtr camera); + + [DllImport(Assembly, EntryPoint = @"VKTCameraSetWhiteBalanceMode")] + public static extern void SetWhiteBalanceMode ( + this IntPtr camera, + CameraDevice.WhiteBalanceMode mode + ); + + [DllImport(Assembly, EntryPoint = @"VKTCameraGetVideoStabilizationMode")] + public static extern CameraDevice.VideoStabilizationMode VideoStabilizationMode (this IntPtr camera); + + [DllImport(Assembly, EntryPoint = @"VKTCameraSetVideoStabilizationMode")] + public static extern void SetVideoStabilizationMode ( + this IntPtr camera, + CameraDevice.VideoStabilizationMode mode + ); + + [DllImport(Assembly, EntryPoint = @"VKTCameraGetZoomRatio")] + public static extern float ZoomRatio (this IntPtr camera); + + [DllImport(Assembly, EntryPoint = @"VKTCameraSetZoomRatio")] + public static extern void SetZoomRatio ( + this IntPtr camera, + float ratio + ); + + [DllImport(Assembly, EntryPoint = @"VKTCameraCapturePhoto")] + public static extern void CapturePhoto ( + this IntPtr camera, + SampleBufferHandler handler, + IntPtr context + ); + + [DllImport(Assembly, EntryPoint = @"VKTCameraImageGetData")] + public static unsafe extern void* CameraImageData (this IntPtr image); + + [DllImport(Assembly, EntryPoint = @"VKTCameraImageGetDataSize")] + public static extern int CameraImageDataSize (this IntPtr image); + + [DllImport(Assembly, EntryPoint = @"VKTCameraImageGetFormat")] + public static extern CameraImage.Format CameraImageFormat (this IntPtr image); + + [DllImport(Assembly, EntryPoint = @"VKTCameraImageGetWidth")] + public static extern int CameraImageWidth (this IntPtr image); + + [DllImport(Assembly, EntryPoint = @"VKTCameraImageGetHeight")] + public static extern int CameraImageHeight (this IntPtr image); + + [DllImport(Assembly, EntryPoint = @"VKTCameraImageGetRowStride")] + public static extern int CameraImageRowStride (this IntPtr image); + + [DllImport(Assembly, EntryPoint = @"VKTCameraImageGetTimestamp")] + public static extern long CameraImageTimestamp (this IntPtr image); + + [DllImport(Assembly, EntryPoint = @"VKTCameraImageGetVerticallyMirrored")] + public static extern bool CameraImageVerticallyMirrored (this IntPtr image); + + [DllImport(Assembly, EntryPoint = @"VKTCameraImageGetPlaneCount")] + public static extern int CameraImagePlaneCount (this IntPtr image); + + [DllImport(Assembly, EntryPoint = @"VKTCameraImageGetPlaneData")] + public static unsafe extern void* CameraImagePlaneData ( + this IntPtr image, + int planeIdx + ); + + [DllImport(Assembly, EntryPoint = @"VKTCameraImageGetPlaneDataSize")] + public static extern int CameraImagePlaneDataSize ( + this IntPtr image, + int planeIdx + ); + + [DllImport(Assembly, EntryPoint = @"VKTCameraImageGetPlaneWidth")] + public static extern int CameraImagePlaneWidth ( + this IntPtr image, + int planeIdx + ); 
+ + [DllImport(Assembly, EntryPoint = @"VKTCameraImageGetPlaneHeight")] + public static extern int CameraImagePlaneHeight ( + this IntPtr image, + int planeIdx + ); + + [DllImport(Assembly, EntryPoint = @"VKTCameraImageGetPlanePixelStride")] + public static extern int CameraImagePlanePixelStride ( + this IntPtr image, + int planeIdx + ); + + [DllImport(Assembly, EntryPoint = @"VKTCameraImageGetPlaneRowStride")] + public static extern int CameraImagePlaneRowStride ( + this IntPtr image, + int planeIdx + ); + + [DllImport(Assembly, EntryPoint = @"VKTCameraImageGetMetadata")] + public static unsafe extern bool CameraImageMetadata ( + this IntPtr image, + Metadata key, + float* value, + int count = 1 + ); + + [DllImport(Assembly, EntryPoint = @"VKTCameraImageConvertToRGBA8888")] + public static unsafe extern void ConvertToRGBA8888 ( + this IntPtr image, + int orientation, + bool mirror, + void* tempBuffer, + void* dstBuffer, + out int dstWidth, + out int dstHeight + ); + #endregion + + + #region --Utility-- + + public static bool IsAppDomainLoaded { // thanks @UnityAlex! + get { + #if ENABLE_MONO && UNITY_EDITOR + var domain = mono_domain_get(); + var unloading = mono_domain_is_unloading(domain); + return !unloading; + [DllImport(@"__Internal")] + static extern IntPtr mono_domain_get (); + [DllImport(@"__Internal")] [return: MarshalAs(UnmanagedType.I4)] + static extern bool mono_domain_is_unloading (IntPtr domain); + #else + return true; + #endif + } + } + + public static void CheckStatus (this Status status) { + switch (status) { + case Status.Ok: break; + case Status.InvalidArgument: throw new ArgumentException(); + case Status.InvalidOperation: throw new InvalidOperationException(); + case Status.NotImplemented: throw new NotImplementedException(); + case Status.InvalidSession: throw new InvalidOperationException(@"VideoKit session token is invalid. Check your NatML access key."); + case Status.InvalidPlan: throw new InvalidOperationException(@"VideoKit plan does not support this operation. Check your plan and upgrade at https://hub.natml.ai"); + case Status.LimitedPlan: Debug.LogWarning(@"VideoKit plan only allows for limited functionality. Check your plan and upgrade at https://hub.natml.ai"); break; + default: throw new InvalidOperationException(); + } + } + #endregion + } +} \ No newline at end of file diff --git a/Runtime/Internal/VideoKit.cs.meta b/Runtime/Internal/VideoKit.cs.meta new file mode 100644 index 0000000..8625c05 --- /dev/null +++ b/Runtime/Internal/VideoKit.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: 1d12a96f7ad904098b02d5ba0df4059e +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {fileID: 2800000, guid: d6df589e0376343f7a3b406329630a74, type: 3} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Internal/VideoKitClient.cs b/Runtime/Internal/VideoKitClient.cs new file mode 100644 index 0000000..ac3280c --- /dev/null +++ b/Runtime/Internal/VideoKitClient.cs @@ -0,0 +1,131 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +#nullable enable + +namespace VideoKit.Internal { + + using System; + using System.Collections.Generic; + using System.Net.Http; + using System.Net.Http.Headers; + using System.Text; + using System.Threading.Tasks; + using UnityEngine; + using UnityEngine.Networking; + using Newtonsoft.Json; + + /// + /// VideoKit API client. + /// + public sealed class VideoKitClient { + + #region --Client API-- + /// + /// VideoKit client version. 
+ /// + public const string Version = @"0.0.16"; + + /// + /// VideoKit API URL. + /// + public const string URL = @"https://www.videokit.ai/api"; + + /// + /// Create the VideoKit client. + /// + /// VideoKit access key. + /// VideoKit API URL. + public VideoKitClient (string accessKey, string url = null) { + this.accessKey = accessKey; + this.url = url ?? URL; + } + + /// + /// Create a build token. + /// + /// Build token. + public async Task CreateBuildToken () { + // Create client + using var request = new HttpClient(); + request.DefaultRequestHeaders.Authorization = !string.IsNullOrEmpty(accessKey) ? new AuthenticationHeaderValue(@"Bearer", accessKey) : null; + // Request + using var response = await request.PostAsync(buildUrl, null); + var responseStr = await response.Content.ReadAsStringAsync(); + var result = JsonConvert.DeserializeObject>(responseStr); + // Check error + if (result.TryGetValue(@"error", out var error)) + throw new InvalidOperationException(error); + // Return + return result?["token"]; + } + + /// + /// Create a session token. + /// + /// Session token. + public Task CreateSessionToken ( + string buildToken, + string bundleId, + string platform + ) { + var payload = new Dictionary { + ["build"] = buildToken, + ["bundle"] = bundleId, + ["platform"] = platform, + ["version"] = Version, + }; + var payloadStr = JsonConvert.SerializeObject(payload); + return Application.platform == RuntimePlatform.WebGLPlayer ? CreateSessionTokenUnity(payloadStr) : CreateSessionTokenDotNet(payloadStr); + } + #endregion + + + #region --Operations-- + private readonly string accessKey; + private readonly string url; + private string buildUrl => $"{url}/build"; + private string sessionUrl => $"{url}/session"; + + private async Task CreateSessionTokenDotNet (string payload) { + // Create client + using var client = new HttpClient(); + client.DefaultRequestHeaders.Authorization = !string.IsNullOrEmpty(accessKey) ? 
new AuthenticationHeaderValue(@"Bearer", accessKey) : null; + // Request + using var content = new StringContent(payload, Encoding.UTF8, @"application/json"); + using var response = await client.PostAsync(sessionUrl, content); + // Parse + var responseStr = await response.Content.ReadAsStringAsync(); + var result = JsonConvert.DeserializeObject>(responseStr); + // Check error + if (result.TryGetValue(@"error", out var error)) + throw new InvalidOperationException(error); + // Return + return result?["token"]; + } + + private async Task CreateSessionTokenUnity (string payload) { + // Generate session token + using var request = new UnityWebRequest(sessionUrl, UnityWebRequest.kHttpVerbPOST) { + uploadHandler = new UploadHandlerRaw(Encoding.UTF8.GetBytes(payload)), + downloadHandler = new DownloadHandlerBuffer(), + disposeDownloadHandlerOnDispose = true, + disposeUploadHandlerOnDispose = true, + }; + request.SetRequestHeader(@"Content-Type", @"application/json"); + request.SetRequestHeader(@"Authorization", $"Bearer {accessKey}"); + request.SendWebRequest(); + while (!request.isDone) + await Task.Yield(); + // Check error + var result = JsonConvert.DeserializeObject>(request.downloadHandler.text); + if (result.TryGetValue(@"error", out var error)) + throw new InvalidOperationException(error); + // Return + return result?["token"]; + } + #endregion + } +} \ No newline at end of file diff --git a/Runtime/Internal/VideoKitClient.cs.meta b/Runtime/Internal/VideoKitClient.cs.meta new file mode 100644 index 0000000..b66ba2c --- /dev/null +++ b/Runtime/Internal/VideoKitClient.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: 4522a5f8f6313492394323b8af393465 +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {instanceID: 0} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Internal/VideoKitExt.cs b/Runtime/Internal/VideoKitExt.cs new file mode 100644 index 0000000..7c2825e --- /dev/null +++ b/Runtime/Internal/VideoKitExt.cs @@ -0,0 +1,92 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. 
+*/ + +#nullable enable + +namespace VideoKit.Internal { + + using System; + using System.Runtime.InteropServices; + using System.Text; + using UnityEngine; + using Devices; + using static VideoKit; + + public static class VideoKitExt { + + #region --Delegates-- + public delegate void ReadbackHandler (IntPtr context, IntPtr pixelBuffer); + #endregion + + + #region --GLESTextureInput-- + #if UNITY_ANDROID && !UNITY_EDITOR + [DllImport(VideoKit.Assembly, EntryPoint = @"VKTGLESTextureInputCreate")] + public static extern void CreateTexutreInput ( + int width, + int height, + ReadbackHandler handler, + out IntPtr input + ); + + [DllImport(VideoKit.Assembly, EntryPoint = @"VKTGLESTextureInputCommitFrame")] + public static extern void CommitFrame ( + this IntPtr input, + IntPtr texture, + IntPtr context + ); + + [DllImport(VideoKit.Assembly, EntryPoint = @"VKTGLESTextureInputRelease")] + public static extern void ReleaseTextureInput (this IntPtr input); + #else + + public static void CreateTexutreInput ( + int width, + int height, + ReadbackHandler handler, + out IntPtr input + ) => input = IntPtr.Zero; + + public static void CommitFrame ( + this IntPtr input, + IntPtr texture, + IntPtr context + ) { } + + public static void ReleaseTextureInput (this IntPtr input) { } + #endif + #endregion + + + #region --AudioSession-- + #if UNITY_IOS && !UNITY_EDITOR + [DllImport(Assembly, EntryPoint = @"VKTConfigureAudioSession")] + public static extern void ConfigureAudioSession (); + #else + public static void ConfigureAudioSession () { } + #endif + #endregion + + + #region --IO-- + #if UNITY_WEBGL && !UNITY_EDITOR + [DllImport(Assembly, EntryPoint = @"VKTAssetWriteImage")] + public static extern Status WriteImage ( + byte[] data, + int size, + [MarshalAs(UnmanagedType.LPStr)] StringBuilder path, + int pathLen + ); + #else + public static Status WriteImage ( + byte[] data, + int size, + StringBuilder path, + int pathLen + ) => Status.Ok; + #endif + #endregion + } +} \ No newline at end of file diff --git a/Runtime/Internal/VideoKitExt.cs.meta b/Runtime/Internal/VideoKitExt.cs.meta new file mode 100644 index 0000000..4c53460 --- /dev/null +++ b/Runtime/Internal/VideoKitExt.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: ddcc0eb9415684523a0d7e1f04375ef6 +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {instanceID: 0} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Internal/VideoKitInfo.cs b/Runtime/Internal/VideoKitInfo.cs new file mode 100644 index 0000000..405ea6b --- /dev/null +++ b/Runtime/Internal/VideoKitInfo.cs @@ -0,0 +1,18 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +using System.Reflection; +using System.Runtime.CompilerServices; +using VideoKit.Internal; + +// Metadata +[assembly: AssemblyCompany(@"NatML Inc")] +[assembly: AssemblyTitle(@"VideoKit")] +[assembly: AssemblyVersionAttribute(VideoKitClient.Version)] +[assembly: AssemblyCopyright(@"Copyright © 2023 NatML Inc. 
All Rights Reserved.")] + +// Friends +[assembly: InternalsVisibleTo(@"VideoKit.Editor")] +[assembly: InternalsVisibleTo(@"VideoKit.Tests")] \ No newline at end of file diff --git a/Runtime/Internal/VideoKitInfo.cs.meta b/Runtime/Internal/VideoKitInfo.cs.meta new file mode 100644 index 0000000..788886d --- /dev/null +++ b/Runtime/Internal/VideoKitInfo.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: e88b378f2441c416a8c496eb5f088802 +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {fileID: 2800000, guid: d6df589e0376343f7a3b406329630a74, type: 3} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Internal/VideoKitSettings.cs b/Runtime/Internal/VideoKitSettings.cs new file mode 100644 index 0000000..663e525 --- /dev/null +++ b/Runtime/Internal/VideoKitSettings.cs @@ -0,0 +1,120 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +#nullable enable + +namespace VideoKit.Internal { + + using System; + using System.Text; + using System.Threading.Tasks; + using UnityEngine; + using Function; + using NatML; + using NatML.API; + using NatML.Internal; + using Status = VideoKit.Status; + + /// + /// VideoKit settings. + /// + [DefaultExecutionOrder(-10_000)] + public sealed class VideoKitSettings : ScriptableObject { + + #region --Client API-- + /// + /// VideoKit API client. + /// + public VideoKitClient? client => _client = _client ?? new VideoKitClient(accessKey); + + /// + /// VideoKit Function client. + /// + public Function? fxn => _fxn = _fxn ?? (!string.IsNullOrEmpty(accessKey) ? FunctionUnity.Create(accessKey) : null); + + /// + /// VideoKit NatML client. + /// + public NatMLClient? natml => _natml ?? (!string.IsNullOrEmpty(accessKey) ? MLUnityExtensions.CreateClient(accessKey) : null); + + /// + /// VideoKit settings for this project. + /// + public static VideoKitSettings? Instance { get; internal set; } + + /// + /// VideoKit application bundle identifier. + /// + public static string BundleId { + get { + var result = new StringBuilder(2048); + VideoKit.BundleIdentifier(result); + return result.ToString(); + } + } + + /// + /// Check the application VideoKit session status. + /// + public async Task CheckSession () { + // Linux editor + if (Application.platform == RuntimePlatform.LinuxEditor) + return Status.NotImplemented; + // Check + var current = VideoKit.SessionStatus(); + if (current != Status.InvalidSession) + return current; + // Set local token + if (!string.IsNullOrEmpty(sessionToken)) { + var status = VideoKit.SetSessionToken(sessionToken); + if (status != Status.InvalidSession) + return status; + } + // Set session token + var token = await client.CreateSessionToken(buildToken, BundleId, ToPlatform(Application.platform)); + var result = VideoKit.SetSessionToken(token); + // Return + return result; + } + #endregion + + + #region --Operations-- + [SerializeField, HideInInspector] + internal string accessKey = string.Empty; + [SerializeField, HideInInspector] + internal string buildToken = string.Empty; + [SerializeField, HideInInspector] + internal string sessionToken = string.Empty; + private VideoKitClient? _client; + private Function? _fxn; + private NatMLClient? _natml; + internal static string FallbackAccessKey => NatMLSettings.Instance?.accessKey; + + private async void Awake () { + // Set singleton in player + if (Application.isEditor) + return; + // Set singleton + Instance = this; + // Check session + await CheckSession(); + } + + private static string? 
ToPlatform (RuntimePlatform platform) => platform switch { + RuntimePlatform.Android => @"ANDROID", + RuntimePlatform.IPhonePlayer => @"IOS", + RuntimePlatform.LinuxEditor => @"LINUX", + RuntimePlatform.LinuxPlayer => @"LINUX", + RuntimePlatform.OSXEditor => @"MACOS", + RuntimePlatform.OSXPlayer => @"MACOS", + RuntimePlatform.WebGLPlayer => @"WEB", + RuntimePlatform.WindowsEditor => @"WINDOWS", + RuntimePlatform.WindowsPlayer => @"WINDOWS", + _ => null + }; + #endregion + } +} \ No newline at end of file diff --git a/Runtime/Internal/VideoKitSettings.cs.meta b/Runtime/Internal/VideoKitSettings.cs.meta new file mode 100644 index 0000000..57bd9a6 --- /dev/null +++ b/Runtime/Internal/VideoKitSettings.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: 2953e2502715b4c6b8ca42a432bdbc35 +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {fileID: 2800000, guid: d6df589e0376343f7a3b406329630a74, type: 3} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/MediaFormat.cs b/Runtime/MediaFormat.cs new file mode 100644 index 0000000..9572876 --- /dev/null +++ b/Runtime/MediaFormat.cs @@ -0,0 +1,50 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +#nullable enable + +namespace VideoKit { + + /// + /// Recording format. + /// + public enum MediaFormat : int { + /// + /// MP4 video with H.264 AVC video codec. + /// This format supports recording both video and audio frames. + /// This format is not supported on WebGL. + /// + MP4 = 0, + /// + /// MP4 video with H.265 HEVC video codec. + /// This format has better compression than `MP4`. + /// This format supports recording both video and audio frames. + /// This format is not supported on WebGL. + /// + HEVC = 1, + /// + /// WEBM video. + /// This format support recording both video and audio frames. + /// This is only supported on Android and WebGL. + /// + WEBM = 2, + /// + /// Animated GIF image. + /// This format only supports recording video frames. + /// + GIF = 3, + /// + /// JPEG image sequence. + /// This format only supports recording video frames. + /// This format is not supported on WebGL. + /// + JPEG = 4, + /// + /// Waveform audio. + /// This format only supports recording audio. + /// + WAV = 5, + } +} \ No newline at end of file diff --git a/Runtime/MediaFormat.cs.meta b/Runtime/MediaFormat.cs.meta new file mode 100644 index 0000000..fa95811 --- /dev/null +++ b/Runtime/MediaFormat.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: 88791ae657ab14c75af7bb9fb380888f +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {fileID: 2800000, guid: 7728882fe6960415e897bae270b67d4e, type: 3} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/MediaRecorder.cs b/Runtime/MediaRecorder.cs new file mode 100644 index 0000000..a5dd1df --- /dev/null +++ b/Runtime/MediaRecorder.cs @@ -0,0 +1,342 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +#nullable enable + +namespace VideoKit { + + using AOT; + using System; + using System.IO; + using System.Linq; + using System.Runtime.InteropServices; + using System.Threading.Tasks; + using UnityEngine; + using Unity.Collections; + using Unity.Collections.LowLevel.Unsafe; + using Internal; + + /// + /// Media recorder capable of recording video and/or audio frames to a media file. + /// All recorder methods are thread safe, and as such can be called from any thread. 
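// Illustrative usage sketch (not from the original sources): picking a format the current
// platform supports, per the `MediaFormat` documentation above (MP4 and HEVC are unavailable
// on WebGL, while WEBM is only available on Android and WebGL).
var format = Application.platform == RuntimePlatform.WebGLPlayer
    ? MediaFormat.WEBM
    : MediaFormat.MP4;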
+ /// + public sealed class MediaRecorder { + + #region --Client API-- + /// + /// Recorder format. + /// + public readonly MediaFormat format; + + /// + /// Recording video frame size. + /// + public (int width, int height) frameSize { + get { + recorder.FrameSize(out var width, out var height).CheckStatus(); + return (width, height); + } + } + + /// + /// Commit a video pixel buffer for encoding. + /// The pixel buffer MUST have an RGBA8888 pixel layout. + /// + /// Pixel buffer to commit. + /// Pixel buffer timestamp in nanoseconds. + public unsafe void CommitFrame (T[] pixelBuffer, long timestamp) where T : unmanaged { + fixed (T* baseAddress = pixelBuffer) + CommitFrame(baseAddress, timestamp); + } + + /// + /// Commit a video pixel buffer for encoding. + /// The pixel buffer MUST have an RGBA8888 pixel layout. + /// + /// Pixel buffer to commit. + /// Pixel buffer timestamp in nanoseconds. + public unsafe void CommitFrame (NativeArray pixelBuffer, long timestamp) where T : unmanaged => CommitFrame( + pixelBuffer.GetUnsafeReadOnlyPtr(), + timestamp + ); + + /// + /// Commit a video pixel buffer for encoding. + /// The pixel buffer MUST have an RGBA8888 pixel layout. + /// + /// Pixel buffer to commit. + /// Pixel buffer timestamp in nanoseconds. + public unsafe void CommitFrame (void* pixelBuffer, long timestamp) => recorder.CommitFrame( + pixelBuffer, + timestamp + ).CheckStatus(); + + /// + /// Commit an audio sample buffer for encoding. + /// The sample buffer MUST be a linear PCM buffer interleaved by channel. + /// + /// Sample buffer to commit. + /// Sample buffer timestamp in nanoseconds. + public unsafe void CommitSamples (float[] sampleBuffer, long timestamp) { + fixed (float* baseAddress = sampleBuffer) + CommitSamples(baseAddress, sampleBuffer.Length, timestamp); + } + + /// + /// Commit an audio sample buffer for encoding. + /// The sample buffer MUST be a linear PCM buffer interleaved by channel. + /// + /// Sample buffer to commit. + /// Sample buffer timestamp in nanoseconds. + public unsafe void CommitSamples (NativeArray sampleBuffer, long timestamp) => CommitSamples( + (float*)sampleBuffer.GetUnsafeReadOnlyPtr(), + sampleBuffer.Length, + timestamp + ); + + /// + /// Commit an audio sample buffer for encoding. + /// The sample buffer MUST be a linear PCM buffer interleaved by channel. + /// + /// Sample buffer to commit. + /// Total number of samples in the buffer. + /// Sample buffer timestamp in nanoseconds. + public unsafe void CommitSamples (float* sampleBuffer, int sampleCount, long timestamp) => recorder.CommitSamples( + sampleBuffer, + sampleCount, + timestamp + ).CheckStatus(); + + /// + /// Finish writing. + /// + /// Path to recorded media file. + public Task FinishWriting () { + var tcs = new TaskCompletionSource(); + var handle = GCHandle.Alloc(tcs, GCHandleType.Normal); + try { + recorder.FinishWriting(OnFinishWriting, (IntPtr)handle).CheckStatus(); + } catch (Exception ex) { + handle.Free(); + tcs.SetException(ex); + } + return tcs.Task; + } + + /// + /// Create a media recorder. + /// NOTE: This requires an active VideoKit plan. + /// + /// Recorder format. + /// Video width. + /// Video height. + /// Video frame rate. + /// Audio sample rate. + /// Audio channel count. + /// Video bit rate in bits per second. + /// Keyframe interval in seconds. + /// Image compression quality in range [0, 1]. + /// Audio bit rate in bits per second. + /// Subdirectory name to save recordings. This will be created if it does not exist. + /// Created recorder. 
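// Illustrative usage sketch (not from the original sources): creating a recorder, committing
// a frame, and finishing. `pixelBuffer` is assumed to be an RGBA8888 byte[] of length
// width * height * 4, and `timestamp` a nanosecond timestamp from an IClock. Runs inside an
// async method.
var recorder = await MediaRecorder.Create(
    MediaFormat.GIF,
    width: 640,
    height: 480,
    frameRate: 10
);
recorder.CommitFrame(pixelBuffer, timestamp);
var path = await recorder.FinishWriting();
Debug.Log($"Recorded media to {path}");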
+ public static async Task Create ( + MediaFormat format, + int width = 0, + int height = 0, + float frameRate = 0f, + int sampleRate = 0, + int channelCount = 0, + int videoBitRate = 10_000_000, + int keyframeInterval = 2, + float compressionQuality = 0.8f, + int audioBitRate = 64_000, + string prefix = null + ) { + // Check session + await VideoKitSettings.Instance.CheckSession(); + // Check video format + if (new [] { MediaFormat.MP4, MediaFormat.HEVC, MediaFormat.GIF, MediaFormat.WEBM, MediaFormat.JPEG }.Contains(format)) { + if (width <= 0) + throw new ArgumentException(@"Recorder width must be positive", nameof(width)); + if (height <= 0) + throw new ArgumentException(@"Recorder height must be positive", nameof(height)); + } + // Check audio format + if (new [] { MediaFormat.MP4, MediaFormat.HEVC, MediaFormat.WAV, MediaFormat.WEBM }.Contains(format)) { + if (!ValidSampleRates.Contains(sampleRate)) + throw new ArgumentException(@"Recorder sample rate must one of " + string.Join(", ", ValidSampleRates), nameof(width)); + if (channelCount < 0) + throw new ArgumentException(@"Recorder channel count must be non negative", nameof(height)); + } + // Check divisible by two + if (new [] { MediaFormat.MP4, MediaFormat.HEVC, MediaFormat.WEBM }.Contains(format)) { + if (width % 2 != 0) + throw new ArgumentException(@"Recorder width must be divisible by 2", nameof(width)); + if (height % 2 != 0) + throw new ArgumentException(@"Recorder height must be divisible by 2", nameof(height)); + } + // Create recorder + switch (format) { + case MediaFormat.MP4: return CreateMP4(width, height, frameRate, sampleRate, channelCount, videoBitRate, keyframeInterval, audioBitRate, prefix); + case MediaFormat.HEVC: return CreateHEVC(width, height, frameRate, sampleRate, channelCount, videoBitRate, keyframeInterval, audioBitRate, prefix); + case MediaFormat.GIF: return CreateGIF(width, height, frameRate, prefix); + case MediaFormat.WAV: return CreateWAV(sampleRate, channelCount, prefix); + case MediaFormat.JPEG: return CreateJPEG(width, height, compressionQuality, prefix); + case MediaFormat.WEBM: return CreateWEBM(width, height, frameRate, sampleRate, channelCount, videoBitRate, keyframeInterval, audioBitRate, prefix); + default: throw new InvalidOperationException($"Cannot create media recorder because format is not supported: {format}"); + } + } + #endregion + + + #region --Operations-- + private readonly IntPtr recorder; + private static string directory = string.Empty; + private static readonly int[] ValidSampleRates = new [] { 0, 8_000, 16_000, 22_050, 24_000, 44_100, 48_000 }; + + private MediaRecorder (IntPtr recorder, MediaFormat format) { + // Check + if (recorder == IntPtr.Zero) + throw new InvalidOperationException(@"Failed to create media recorder. 
Check the logs for more info."); + // Set + this.recorder = recorder; + this.format = format; + } + + public static implicit operator IntPtr (MediaRecorder recorder) => recorder.recorder; + + private static MediaRecorder CreateMP4 (int width, int height, float frameRate, int sampleRate, int channelCount, int videoBitRate, int keyframeInterval, int audioBitRate, string prefix) { + // Create + VideoKit.CreateMP4Recorder( + CreatePath(extension:@".mp4", prefix:prefix), + width, + height, + frameRate, + sampleRate, + channelCount, + videoBitRate, + keyframeInterval, + audioBitRate, + out var recorder + ).CheckStatus(); + // Return + return new MediaRecorder(recorder, MediaFormat.MP4); + } + + private static MediaRecorder CreateHEVC (int width, int height, float frameRate, int sampleRate, int channelCount, int videoBitRate, int keyframeInterval, int audioBitRate, string prefix) { + // Create + VideoKit.CreateHEVCRecorder( + CreatePath(extension: @".mp4", prefix: prefix), + width, + height, + frameRate, + sampleRate, + channelCount, + videoBitRate, + keyframeInterval, + audioBitRate, + out var recorder + ).CheckStatus(); + // Return + return new MediaRecorder(recorder, MediaFormat.HEVC); + } + + private static MediaRecorder CreateGIF (int width, int height, float frameRate, string prefix) { + // Create + VideoKit.CreateGIFRecorder( + CreatePath(extension: @".gif", prefix: prefix), + width, + height, + 1f / frameRate, + out var recorder + ).CheckStatus(); + // Return + return new MediaRecorder(recorder, MediaFormat.GIF); + } + + private static MediaRecorder CreateWAV (int sampleRate, int channelCount, string prefix) { + // Create + VideoKit.CreateWAVRecorder( + CreatePath(extension: @".wav", prefix: prefix), + sampleRate, + channelCount, + out var recorder + ).CheckStatus(); + // Return + return new MediaRecorder(recorder, MediaFormat.WAV); + } + + private static MediaRecorder CreateWEBM (int width, int height, float frameRate, int sampleRate, int channelCount, int videoBitRate, int keyframeInterval, int audioBitRate, string prefix) { + // Create + VideoKit.CreateWEBMRecorder( + CreatePath(extension: @".webm", prefix: prefix), + width, + height, + frameRate, + sampleRate, + channelCount, + videoBitRate, + keyframeInterval, + audioBitRate, + out var recorder + ).CheckStatus(); + // Return + return new MediaRecorder(recorder, MediaFormat.WEBM); + } + + private static MediaRecorder CreateJPEG (int width, int height, float quality, string prefix) { + // Create + VideoKit.CreateJPEGRecorder( + CreatePath(prefix: prefix), + width, + height, + quality, + out var recorder + ).CheckStatus(); + // Return + return new MediaRecorder(recorder, MediaFormat.JPEG); + } + + internal static string CreatePath (string? extension = null, string? prefix = null) { + // Create parent directory + var parentDirectory = !string.IsNullOrEmpty(prefix) ? Path.Combine(directory, prefix) : directory; + Directory.CreateDirectory(parentDirectory); + // Get recording path + var timestamp = DateTime.Now.ToString("yyyy_MM_dd_HH_mm_ss_fff"); + var name = $"recording_{timestamp}{extension ?? string.Empty}"; + var path = Path.Combine(parentDirectory, name); + // Return + return path; + } + + [RuntimeInitializeOnLoadMethod(RuntimeInitializeLoadType.AfterAssembliesLoaded)] + private static void OnInitialize () => directory = Application.isEditor ? 
+ Directory.GetCurrentDirectory() : + Application.persistentDataPath; + + [MonoPInvokeCallback(typeof(VideoKit.RecordingHandler))] + private static unsafe void OnFinishWriting (IntPtr context, IntPtr path) { + // Check + if (!VideoKit.IsAppDomainLoaded) + return; + // Get tcs + TaskCompletionSource tcs; + try { + var handle = (GCHandle)context; + tcs = handle.Target as TaskCompletionSource; + handle.Free(); + } catch (Exception ex) { + Debug.LogException(ex); + return; + } + // Invoke + if (path != IntPtr.Zero) + tcs?.SetResult(Marshal.PtrToStringUTF8(path)); + else + tcs?.SetException(new Exception(@"Recorder failed to finish writing")); + } + #endregion + } +} \ No newline at end of file diff --git a/Runtime/MediaRecorder.cs.meta b/Runtime/MediaRecorder.cs.meta new file mode 100644 index 0000000..7adb415 --- /dev/null +++ b/Runtime/MediaRecorder.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: b0c8128c732234a138e37b2b3e9a83bc +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {fileID: 2800000, guid: 7728882fe6960415e897bae270b67d4e, type: 3} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Outputs.meta b/Runtime/Outputs.meta new file mode 100644 index 0000000..4be1fc0 --- /dev/null +++ b/Runtime/Outputs.meta @@ -0,0 +1,8 @@ +fileFormatVersion: 2 +guid: 2c0fd8dbe1ff34114b452cc9f6443646 +folderAsset: yes +DefaultImporter: + externalObjects: {} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Outputs/AudioClipOutput.cs b/Runtime/Outputs/AudioClipOutput.cs new file mode 100644 index 0000000..4906257 --- /dev/null +++ b/Runtime/Outputs/AudioClipOutput.cs @@ -0,0 +1,81 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +namespace VideoKit.Devices.Outputs { + + using System; + using System.IO; + using UnityEngine; + using Unity.Collections; + using Devices; + + /// + /// Audio device output that accumulates audio buffers into an `AudioClip`. + /// + public sealed class AudioClipOutput : AudioOutput { + + #region --Client API-- + /// + /// Create an audio clip output. + /// + public AudioClipOutput () { + this.sampleBuffer = new MemoryStream(); + this.fence = new object(); + } + + /// + /// Update the output with a new audio buffer. + /// + /// Audio buffer. + public override void Update (AudioBuffer audioBuffer) { + lock (fence) { + sampleRate = sampleRate == 0 ? audioBuffer.sampleRate : sampleRate; + channelCount = channelCount == 0 ? audioBuffer.channelCount : channelCount; + var audioData = new NativeSlice(audioBuffer.sampleBuffer).SliceConvert().ToArray(); + sampleBuffer.Write(audioData, 0, audioData.Length); + buffer = audioBuffer.Clone(); + } + } + + /// + /// Dispose the output and release resources. + /// + public override void Dispose () { + lock (fence) + sampleBuffer.Dispose(); + } + + /// + /// Get the current clip containing all audio recorded up till this point. + /// Note that this clip DOES NOT stream new audio that is provided to the output. 
+ /// + public AudioClip ToClip () { + lock (fence) { + // Check + if (sampleRate == 0 || channelCount == 0) + return null; + // Get the full sample buffer + var byteSamples = sampleBuffer.ToArray(); + var totalSampleCount = byteSamples.Length / sizeof(float); + var floatSamples = new float[totalSampleCount]; + var recordingName = string.Format("recording_{0}", DateTime.Now.ToString("yyyy_MM_dd_HH_mm_ss_fff")); + Buffer.BlockCopy(byteSamples, 0, floatSamples, 0, byteSamples.Length); + // Create audio clip + var audioClip = AudioClip.Create(recordingName, totalSampleCount / channelCount, channelCount, sampleRate, false); + audioClip.SetData(floatSamples, 0); + return audioClip; + } + } + #endregion + + + #region --Operations-- + private readonly MemoryStream sampleBuffer; + private readonly object fence; + private int sampleRate; + private int channelCount; + #endregion + } +} \ No newline at end of file diff --git a/Runtime/Outputs/AudioClipOutput.cs.meta b/Runtime/Outputs/AudioClipOutput.cs.meta new file mode 100644 index 0000000..dc96a94 --- /dev/null +++ b/Runtime/Outputs/AudioClipOutput.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: 262befd793de94644a8213d1396e82f8 +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {instanceID: 0} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Outputs/AudioOutput.cs b/Runtime/Outputs/AudioOutput.cs new file mode 100644 index 0000000..0d1287b --- /dev/null +++ b/Runtime/Outputs/AudioOutput.cs @@ -0,0 +1,46 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +namespace VideoKit.Devices.Outputs { + + using System; + + /// + /// + public abstract class AudioOutput { + + #region --Client API-- + /// + /// Latest audio buffer processed by the audio output. + /// + public AudioBuffer buffer { get; protected set; } + + /// + /// Latest audio buffer timestamp. + /// This is the timestamp of the latest buffer processed by the audio output. + /// + public long timestamp => buffer.timestamp; + + /// + /// Update the output with a new audio buffer. + /// + /// Audio buffer. + public abstract void Update (AudioBuffer audioBuffer); + + /// + /// Dispose the audio output and release resources. + /// + public virtual void Dispose () {} + #endregion + + + #region --Operations-- + /// + /// Implicitly convert the output to an audio buffer delegate. + /// + public static implicit operator Action (AudioOutput output) => output.Update; + #endregion + } +} \ No newline at end of file diff --git a/Runtime/Outputs/AudioOutput.cs.meta b/Runtime/Outputs/AudioOutput.cs.meta new file mode 100644 index 0000000..f2ef7d7 --- /dev/null +++ b/Runtime/Outputs/AudioOutput.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: 9bc7e49c569ee4ad3b83cf98d98b0a3b +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {instanceID: 0} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Outputs/CameraOutput.cs b/Runtime/Outputs/CameraOutput.cs new file mode 100644 index 0000000..c003b94 --- /dev/null +++ b/Runtime/Outputs/CameraOutput.cs @@ -0,0 +1,44 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +namespace VideoKit.Devices.Outputs { + + using System; + + /// + /// Camera device output which consumes camera images. + /// + public abstract class CameraOutput { + + #region --Client API-- + /// + /// Latest camera image processed by the camera output. 
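// Illustrative usage sketch (not from the original sources): accumulating microphone audio
// into an AudioClip. `audioDevice` is assumed to be an AudioDevice whose StartRunning method
// accepts an Action<AudioBuffer> handler; the implicit conversion on AudioOutput lets the
// output be passed directly as that handler.
var clipOutput = new AudioClipOutput();
audioDevice.StartRunning(clipOutput);
// ... later, after stopping the device ...
AudioClip clip = clipOutput.ToClip();
clipOutput.Dispose();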
+ /// + public CameraImage image { get; protected set; } + + /// + /// Latest camera image timestamp. + /// This is the timestamp of the latest cimage processed by the camera output. + /// + public long timestamp => image.timestamp; + + /// + /// Update the output with a new camera image. + /// + /// Camera image. + public abstract void Update (CameraImage image); + + /// + /// Dispose the camera output and release resources. + /// + public virtual void Dispose () {} + #endregion + + + #region --Operations-- + public static implicit operator Action (CameraOutput output) => output.Update; + #endregion + } +} \ No newline at end of file diff --git a/Runtime/Outputs/CameraOutput.cs.meta b/Runtime/Outputs/CameraOutput.cs.meta new file mode 100644 index 0000000..a91f1b2 --- /dev/null +++ b/Runtime/Outputs/CameraOutput.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: 661c84148e38f458c82f2776e637f370 +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {instanceID: 0} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Outputs/PixelBufferOutput.cs b/Runtime/Outputs/PixelBufferOutput.cs new file mode 100644 index 0000000..d649b72 --- /dev/null +++ b/Runtime/Outputs/PixelBufferOutput.cs @@ -0,0 +1,142 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +namespace VideoKit.Devices.Outputs { + + using System; + using System.Collections.Generic; + using System.Runtime.CompilerServices; + using UnityEngine; + using Unity.Collections; + using Unity.Collections.LowLevel.Unsafe; + using Internal; + using Utilities; + + /// + /// Camera device output that converts camera images into RGBA8888 pixel buffers. + /// + public sealed class PixelBufferOutput : CameraOutput { + + #region --Client API-- + /// + /// Pixel buffer conversion options. + /// + public class ConversionOptions { + /// + /// Desired pixel buffer orientation. + /// + public ScreenOrientation orientation; + /// + /// Whether to vertically mirror the pixel buffer. + /// + public bool mirror; + } + + /// + /// Pixel buffer with latest camera image. + /// The pixel buffer is always laid out in RGBA8888 format. + /// + public NativeArray pixelBuffer => convertedBuffer; + + /// + /// Pixel buffer width. + /// + public int width { get; private set; } + + /// + /// Pixel buffer height. + /// + public int height { get; private set; } + + /// + /// Get or set the pixel buffer orientation. + /// + public ScreenOrientation orientation; + + /// + /// Create a pixel buffer output. + /// + public PixelBufferOutput () { + this.orientation = OrientationSupport.Contains(Application.platform) ? Screen.orientation : 0; + this.lifecycleHelper = LifecycleHelper.Create(); + this.fence = new object(); + lifecycleHelper.onQuit += Dispose; + } + + /// + /// Update the output with a new camera image. + /// + /// Camera image. + public override void Update (CameraImage image) => Update(image, null); + + /// + /// Update the output with a new camera image. + /// + /// Camera image. + /// Conversion options. + public unsafe void Update (CameraImage image, ConversionOptions options) { + lock (fence) { + // Check + if (!lifecycleHelper) + return; + // Convert + var orientation = options?.orientation ?? this.orientation; + var mirror = options?.mirror ?? 
image.verticallyMirrored; + var bufferSize = image.width * image.height * 4; + EnsureCapacity(ref convertedBuffer, bufferSize); + EnsureCapacity(ref tempBuffer, bufferSize); + image.nativeImage.ConvertToRGBA8888( + (int)orientation, + mirror, + tempBuffer.GetUnsafePtr(), + convertedBuffer.GetUnsafePtr(), + out var width, + out var height + ); + // Update + this.width = width; + this.height = height; + this.image = image.Clone(); + } + } + + /// + /// Dispose the pixel buffer output and release resources. + /// + public override void Dispose () { + lock (fence) { + EnsureCapacity(ref convertedBuffer, 0); + EnsureCapacity(ref tempBuffer, 0); + if (lifecycleHelper) + lifecycleHelper.Dispose(); + } + } + #endregion + + + #region --Operations-- + internal readonly LifecycleHelper lifecycleHelper; + private readonly object fence; + private NativeArray convertedBuffer; + private NativeArray tempBuffer; + private static readonly List OrientationSupport = new List { + RuntimePlatform.Android, + RuntimePlatform.IPhonePlayer + }; + + [MethodImpl(MethodImplOptions.AggressiveInlining)] + private static void EnsureCapacity (ref NativeArray buffer, int capacity) { + // Check + if (buffer.Length == capacity) + return; + // Dispose // Checking `IsCreated` doesn't prevent this + try { buffer.Dispose(); } catch (ObjectDisposedException) { } + // Recreate + if (capacity > 0) + buffer = new NativeArray(capacity, Allocator.Persistent); + } + #endregion + } +} \ No newline at end of file diff --git a/Runtime/Outputs/PixelBufferOutput.cs.meta b/Runtime/Outputs/PixelBufferOutput.cs.meta new file mode 100644 index 0000000..720cb5b --- /dev/null +++ b/Runtime/Outputs/PixelBufferOutput.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: 8ee24c3988d5e4f1e865b68423449542 +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {instanceID: 0} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Outputs/RenderTextureOutput.cs b/Runtime/Outputs/RenderTextureOutput.cs new file mode 100644 index 0000000..5e4709e --- /dev/null +++ b/Runtime/Outputs/RenderTextureOutput.cs @@ -0,0 +1,284 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +namespace VideoKit.Devices.Outputs { + + using System; + using System.Collections.Generic; + using System.Runtime.CompilerServices; + using System.Threading.Tasks; + using UnityEngine; + using Unity.Collections; + using Unity.Collections.LowLevel.Unsafe; + using Internal; + using Utilities; + + /// + /// Camera device output that streams camera images into a `RenderTexture` for display. + /// The render texture output performs necessary conversions entirely on the GPU. + /// This output can provide better performance than the `TextureOutput` when pixel data is not accessed on the CPU. + /// + public sealed class RenderTextureOutput : CameraOutput { + + #region --Client API-- + /// + /// RenderTexture conversion options. + /// + public sealed class ConversionOptions : PixelBufferOutput.ConversionOptions { } + + /// + /// Texture containing the latest camera image. + /// This is `null` when no image has been processed by the output. + /// + public readonly RenderTexture texture; + + /// + /// Get or set the pixel buffer orientation. + /// + public ScreenOrientation orientation; + + /// + /// Event raised when a new camera image is available in the texture output. + /// + public event Action OnFrame; + + /// + /// Create a RenderTexture output. 
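+ /// A minimal usage sketch (hypothetical; assumes a running `CameraDevice` named `cameraDevice` and that `NextFrame` yields the converted texture):
+ ///
+ ///     var output = new RenderTextureOutput();
+ ///     cameraDevice.StartRunning(output);     // the output converts implicitly to a camera image delegate
+ ///     var frame = await output.NextFrame();  // wait for the first converted frame
+ ///     // ... display `output.texture` ...
+ ///     cameraDevice.StopRunning();
+ ///     output.Dispose();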
+ /// + public RenderTextureOutput () { + // Check + if (!SystemInfo.supportsComputeShaders) + throw new InvalidOperationException(@"RenderTextureOutput can only be used on platforms that support compute shaders"); + // Create texture + var descriptor = new RenderTextureDescriptor(16, 16, RenderTextureFormat.ARGB32, 0); + descriptor.enableRandomWrite = true; + this.texture = new RenderTexture(descriptor); + // Initiailize + this.shader = Resources.Load(@"RenderTextureOutput") as ComputeShader; + this.lifecycleHelper = LifecycleHelper.Create(); + this.orientation = DefaultOrientation; + this.conversionKernelMap = new Dictionary { + [CameraImage.Format.YCbCr420] = shader.FindKernel(@"ConvertYUV420"), + [CameraImage.Format.RGBA8888] = shader.FindKernel(@"ConvertRGBA8888"), + [CameraImage.Format.BGRA8888] = shader.FindKernel(@"ConvertBGRA8888"), + }; + this.rotationKernelMap = new Dictionary { + [ScreenOrientation.Portrait] = shader.FindKernel(@"Rotate90"), + [ScreenOrientation.LandscapeRight] = shader.FindKernel(@"Rotate180"), + [ScreenOrientation.PortraitUpsideDown] = shader.FindKernel(@"Rotate270"), + }; + this.conversionOffset = new int[4]; + this.conversionStride = new int[4]; + this.taskCompletionSource = new TaskCompletionSource(); + this.fence = new object(); + // Upload + lifecycleHelper.onUpdate += OnImageBuffer; + } + + /// + /// Update the output with a new camera image. + /// + /// Camera image. + public override void Update (CameraImage image) => Update(image, null); + + /// + /// Update the output with a new camera image. + /// + /// Camera image. + /// Conversion options. + public void Update (CameraImage image, ConversionOptions options) { + lock (fence) { + var bufferSize = image.width * image.height * 4 + 16 * image.height; // Overallocate in case of padding + pixelBuffer = pixelBuffer?.Length == bufferSize ? pixelBuffer : null; + pixelBuffer ??= new byte[bufferSize]; + CopyImage(image, pixelBuffer, conversionOffset, conversionStride); + imageBuffer = new ImageBuffer { + image = image.Clone(), + pixelBuffer = pixelBuffer, + mirror = options?.mirror ?? image.verticallyMirrored, + orientation = options?.orientation ?? this.orientation, + }; + } + } + + /// + /// Get a task that completes when the next frame is available. + /// + public Task NextFrame () => taskCompletionSource.Task; + + /// + /// Dispose the RenderTexture output and release resources. 
+ /// + public override void Dispose () { + lock (fence) { + if (texture) { + texture.Release(); + RenderTexture.Destroy(texture); + } + if (lifecycleHelper) + lifecycleHelper.Dispose(); + conversionBuffer?.Dispose(); + pixelBuffer = null; + conversionBuffer = null; + } + } + #endregion + + + #region --Operations-- + private readonly ComputeShader shader; + private readonly LifecycleHelper lifecycleHelper; + private readonly IReadOnlyDictionary conversionKernelMap; + private readonly IReadOnlyDictionary rotationKernelMap; + private readonly int[] conversionOffset; + private readonly int[] conversionStride; + private readonly TaskCompletionSource taskCompletionSource; + private readonly object fence; + private byte[] pixelBuffer; + private ImageBuffer imageBuffer; + private ComputeBuffer conversionBuffer; + + private struct ImageBuffer { + public CameraImage image; + public byte[] pixelBuffer; + public bool mirror; + public ScreenOrientation orientation; + } + + private void OnImageBuffer () { + ImageBuffer imageBuffer; + lock (fence) { + // Check dirty + imageBuffer = this.imageBuffer; + if (timestamp == imageBuffer.image.timestamp) + return; + // Check buffer + if (conversionBuffer != null && conversionBuffer.count * conversionBuffer.stride != imageBuffer.pixelBuffer.Length) { + conversionBuffer.Dispose(); + conversionBuffer = null; + } + // Upload + conversionBuffer ??= new ComputeBuffer(imageBuffer.pixelBuffer.Length / sizeof(int), sizeof(int), ComputeBufferType.Raw, ComputeBufferMode.Immutable); + conversionBuffer.SetData(imageBuffer.pixelBuffer); + } + // Convert + var imageDescriptor = new RenderTextureDescriptor(imageBuffer.image.width, imageBuffer.image.height, RenderTextureFormat.ARGB32, 0); + imageDescriptor.enableRandomWrite = true; + var convertedTexture = RenderTexture.GetTemporary(imageDescriptor); + var conversionKernel = conversionKernelMap[imageBuffer.image.format]; + shader.SetBuffer(conversionKernel, @"Input", conversionBuffer); + shader.SetTexture(conversionKernel, @"Result", convertedTexture); + shader.SetInts(@"Offset", conversionOffset); + shader.SetInts(@"Stride", conversionStride); + shader.SetBool(@"Mirror", imageBuffer.mirror); + shader.GetKernelThreadGroupSizes(conversionKernel, out var gx, out var gy, out var _); + shader.Dispatch(conversionKernel, Mathf.CeilToInt((float)imageBuffer.image.width / gx), Mathf.CeilToInt((float)imageBuffer.image.height / gy), 1); + // Create + var portrait = imageBuffer.orientation == ScreenOrientation.Portrait || imageBuffer.orientation == ScreenOrientation.PortraitUpsideDown; + var width = portrait ? imageBuffer.image.height : imageBuffer.image.width; + var height = portrait ? 
imageBuffer.image.width : imageBuffer.image.height; + if (texture.width != width || texture.height != height) { + texture.Release(); + texture.width = width; + texture.height = height; + } + // Rotate + if (rotationKernelMap.TryGetValue(GetAdjustedOrientation(imageBuffer), out var rotationKernel)) { + shader.SetTexture(rotationKernel, @"Image", convertedTexture); + shader.SetTexture(rotationKernel, @"Result", texture); + shader.GetKernelThreadGroupSizes(rotationKernel, out var rx, out var ry, out var _); + shader.Dispatch(rotationKernel, Mathf.CeilToInt((float)width / rx), Mathf.CeilToInt((float)height / ry), 1); + } + else + Graphics.Blit(convertedTexture, texture); + RenderTexture.ReleaseTemporary(convertedTexture); + // Notify + image = imageBuffer.image; + taskCompletionSource.TrySetResult(texture); + OnFrame?.Invoke(this); + } + + [MethodImpl(MethodImplOptions.AggressiveInlining)] + private static void CopyImage (in CameraImage image, byte[] buffer, int[] offset, int[] stride) { + var i420 = image.planes != null && image.planes[1].pixelStride == 1; + switch (image.format) { + case CameraImage.Format.YCbCr420 when i420: + CopyImageYUV420p(image, buffer, offset, stride); + break; + case CameraImage.Format.YCbCr420 when !i420: + CopyImageYUV420sp(image, buffer, offset, stride); + break; + case CameraImage.Format.BGRA8888: + case CameraImage.Format.RGBA8888: + CopyImageRGBA8888(image, buffer, offset, stride); + break; + } + } + + [MethodImpl(MethodImplOptions.AggressiveInlining)] + private static void CopyImageRGBA8888 (in CameraImage image, byte[] buffer, int[] offset, int[] stride) { + NativeArray.Copy(image.pixelBuffer, buffer, image.pixelBuffer.Length); + Array.Clear(offset, 0, offset.Length); + stride[0] = image.pixelBuffer.Length / image.height; + } + + [MethodImpl(MethodImplOptions.AggressiveInlining)] + private static void CopyImageYUV420p (in CameraImage image, byte[] buffer, int[] offset, int[] stride) { + var yBuffer = image.planes[0].buffer; + var cbBuffer = image.planes[1].buffer; + var crBuffer = image.planes[2].buffer; + NativeArray.Copy(yBuffer, buffer, yBuffer.Length); + NativeArray.Copy(cbBuffer, 0, buffer, yBuffer.Length, cbBuffer.Length); + NativeArray.Copy(crBuffer, 0, buffer, yBuffer.Length + cbBuffer.Length, crBuffer.Length); + offset[0] = 0; + offset[1] = yBuffer.Length; + offset[2] = yBuffer.Length + cbBuffer.Length; + offset[3] = 0; + stride[0] = image.planes[0].rowStride; + stride[1] = image.planes[1].rowStride; + stride[2] = image.planes[0].pixelStride; + stride[3] = image.planes[1].pixelStride; + } + + [MethodImpl(MethodImplOptions.AggressiveInlining)] + private static unsafe void CopyImageYUV420sp (in CameraImage image, byte[] buffer, int[] offset, int[] stride) { + var nv21 = image.planes.Length > 2 && image.planes[1].buffer.GetUnsafePtr() > image.planes[2].buffer.GetUnsafePtr(); + var yBuffer = image.planes[0].buffer; + var cbcrBuffer = nv21 ? image.planes[2].buffer : image.planes[1].buffer; + var cbcrLength = cbcrBuffer.Length + (cbcrBuffer.Length % 2); + NativeArray.Copy(yBuffer, buffer, yBuffer.Length); + fixed (byte* dstBuffer = buffer) + UnsafeUtility.MemCpy(dstBuffer + yBuffer.Length, cbcrBuffer.GetUnsafePtr(), cbcrLength); + offset[0] = 0; + offset[1] = nv21 ? yBuffer.Length - 1 : yBuffer.Length; + offset[2] = nv21 ? 
yBuffer.Length : yBuffer.Length + 1; + offset[3] = 0; + stride[0] = image.planes[0].rowStride; + stride[1] = image.planes[1].rowStride; + stride[2] = image.planes[0].pixelStride; + stride[3] = image.planes[1].pixelStride; + } + + private static ScreenOrientation DefaultOrientation => Application.platform switch { + RuntimePlatform.Android => Screen.orientation, + RuntimePlatform.IPhonePlayer => Screen.orientation, + _ => 0, + }; + + private static ScreenOrientation GetAdjustedOrientation (in ImageBuffer imageBuffer) { + if (Application.platform != RuntimePlatform.Android) + return imageBuffer.orientation; + if (imageBuffer.mirror) + return imageBuffer.orientation; + switch (imageBuffer.orientation) { + case ScreenOrientation.Portrait: return ScreenOrientation.PortraitUpsideDown; + case ScreenOrientation.LandscapeLeft: return ScreenOrientation.LandscapeRight; + case ScreenOrientation.LandscapeRight: return ScreenOrientation.LandscapeLeft; + default: return imageBuffer.orientation; + } + } + #endregion + } +} \ No newline at end of file diff --git a/Runtime/Outputs/RenderTextureOutput.cs.meta b/Runtime/Outputs/RenderTextureOutput.cs.meta new file mode 100644 index 0000000..fdf9b54 --- /dev/null +++ b/Runtime/Outputs/RenderTextureOutput.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: 56f4c71b663394484af6969fd398499f +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {instanceID: 0} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Outputs/TextureOutput.cs b/Runtime/Outputs/TextureOutput.cs new file mode 100644 index 0000000..f30f939 --- /dev/null +++ b/Runtime/Outputs/TextureOutput.cs @@ -0,0 +1,119 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +namespace VideoKit.Devices.Outputs { + + using System; + using System.Runtime.CompilerServices; + using System.Threading.Tasks; + using UnityEngine; + + /// + /// Camera device output that streams camera images into a `Texture2D`. + /// The texture output uses the `PixelBufferOutput` to convert camera images to `RGBA8888` before uploading to the GPU. + /// The rendered texture data is accessible on the CPU using the `Texture2D` data access methods. + /// + public sealed class TextureOutput : CameraOutput { + + #region --Client API-- + /// + /// Texture conversion options. + /// + public class ConversionOptions : PixelBufferOutput.ConversionOptions { } + + /// + /// Texture containing the latest camera image. + /// This is `null` when no image has been processed by the output. + /// + public readonly Texture2D texture; + + /// + /// Get or set the texture orientation. + /// + public ScreenOrientation orientation { + get => pixelBufferOutput.orientation; + set => pixelBufferOutput.orientation = value; + } + + /// + /// Event raised when a new camera image is available in the texture output. + /// + public event Action OnFrame; + + /// + /// Create a texture output. + /// + public TextureOutput () { + this.texture = new Texture2D(16, 16, TextureFormat.RGBA32, false, false); + this.pixelBufferOutput = new PixelBufferOutput(); + this.taskCompletionSource = new TaskCompletionSource(); + this.fence = new object(); + pixelBufferOutput.lifecycleHelper.onUpdate += OnPixelBuffer; + } + + /// + /// Update the output with a new camera image. + /// + /// Camera image. + public override void Update (CameraImage image) => Update(image, null); + + /// + /// Update the output with a new camera image. + /// + /// Camera image. 
+ /// Conversion options. + public void Update (CameraImage image, ConversionOptions options) { + lock (fence) + pixelBufferOutput.Update(image, options); + } + + /// + /// Get a task that completes when the next frame is available. + /// + public Task NextFrame () => taskCompletionSource.Task; + + /// + /// Dispose the texture output and release resources. + /// + public override void Dispose () { + lock (fence) { + pixelBufferOutput.Dispose(); + taskCompletionSource.TrySetCanceled(); + Texture2D.Destroy(texture); + } + } + #endregion + + + #region --Operations-- + private readonly PixelBufferOutput pixelBufferOutput; + private readonly TaskCompletionSource taskCompletionSource; + private readonly object fence; + + private void OnPixelBuffer () { + lock (fence) { + // Check first frame + if (pixelBufferOutput.timestamp == 0L) + return; + // Check dirty + if (timestamp == pixelBufferOutput.timestamp) + return; + // Check texture + var (width, height) = (pixelBufferOutput.width, pixelBufferOutput.height); + if (texture.width != width || texture.height != height) + texture.Reinitialize(pixelBufferOutput.width, pixelBufferOutput.height); + // Upload + texture.GetRawTextureData().CopyFrom(pixelBufferOutput.pixelBuffer); + texture.Apply(); + // Update timestamp + image = pixelBufferOutput.image; + } + // Notify + taskCompletionSource.TrySetResult(texture); + OnFrame?.Invoke(this); + } + #endregion + } +} \ No newline at end of file diff --git a/Runtime/Outputs/TextureOutput.cs.meta b/Runtime/Outputs/TextureOutput.cs.meta new file mode 100644 index 0000000..97138d8 --- /dev/null +++ b/Runtime/Outputs/TextureOutput.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: 8b95039575e56474e8531f922b844414 +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {instanceID: 0} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/UI.meta b/Runtime/UI.meta new file mode 100644 index 0000000..179f127 --- /dev/null +++ b/Runtime/UI.meta @@ -0,0 +1,8 @@ +fileFormatVersion: 2 +guid: 8a2469b7f93f0498298587eb032fe505 +folderAsset: yes +DefaultImporter: + externalObjects: {} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/UI/VideoKitCameraView.cs b/Runtime/UI/VideoKitCameraView.cs new file mode 100644 index 0000000..9fd02e0 --- /dev/null +++ b/Runtime/UI/VideoKitCameraView.cs @@ -0,0 +1,177 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +namespace VideoKit.UI { + + using UnityEngine; + using UnityEngine.Events; + using UnityEngine.EventSystems; + using UnityEngine.UI; + using Devices; + + /// + /// VideoKit UI component for displaying the camera preview from a camera manager. + /// + [Tooltip(@"VideoKit UI component for displaying the camera preview from a camera manager.")] + [RequireComponent(typeof(RawImage), typeof(AspectRatioFitter), typeof(EventTrigger))] + [HelpURL(@"https://docs.videokit.ai/videokit/api/videokitcameraview")] + [DisallowMultipleComponent] + public sealed partial class VideoKitCameraView : MonoBehaviour, IPointerUpHandler, IBeginDragHandler, IDragHandler { + + #region --Enumerations-- + /// + /// View mode. + /// + public enum ViewMode : int { + /// + /// Display the camera texture. + /// + CameraTexture = 0, + /// + /// Display the human texture. + /// + HumanTexture = 1, + } + + /// + /// Gesture mode. + /// + public enum GestureMode : int { + /// + /// Do not respond to gestures. + /// + None = 0, + /// + /// Detect tap gestures. 
+ /// + Tap = 1, + /// + /// Detect two-finger pinch gestures. + /// + Pinch = 2, + /// + /// Detect single-finger drag gestures. + /// This gesture mode is recommended when the user is holding a button to record a video. + /// + Drag = 3, + } + #endregion + + + #region --Inspector-- + [Header(@"Configuration")] + /// + /// VideoKit camera manager. + /// + [Tooltip(@"VideoKit camera manager.")] + public VideoKitCameraManager cameraManager; + + /// + /// View mode of the view. + /// + [Tooltip(@"View mode of the view.")] + public ViewMode viewMode = ViewMode.CameraTexture; + + [Header(@"Gestures")] + /// + /// Focus gesture mode. + /// + [Tooltip(@"Focus gesture mode.")] + public GestureMode focusMode = GestureMode.None; + + /// + /// Exposure gesture mode. + /// + [Tooltip(@"Exposure gesture mode.")] + public GestureMode exposureMode = GestureMode.None; + + /// + /// Zoom gesture mode. + /// + [Tooltip(@"Zoom gesture mode.")] + public GestureMode zoomMode = GestureMode.None; + + [Header(@"Events")] + /// + /// Event raised when the camera preview is presented on the UI panel. + /// + [Tooltip(@"Event raised when the camera preview is presented on the UI panel.")] + public UnityEvent OnPresent; + #endregion + + + #region --Operations-- + private RawImage rawImage; + private AspectRatioFitter aspectFitter; + private bool presented; + + private void Reset () => cameraManager = FindObjectOfType(); + + private void Awake () { + // Get components + rawImage = GetComponent(); + aspectFitter = GetComponent(); + presented = false; + // Listen for frames + if (cameraManager) + cameraManager.OnCameraFrame.AddListener(OnCameraFrame); + } + + private void Update () => presented &= rawImage.texture; + + private void OnCameraFrame () { + // Check + if (!isActiveAndEnabled) + return; + // Get texture + var texture = viewMode == ViewMode.HumanTexture ? 
cameraManager.humanTexture : cameraManager.texture; + if (!texture) + return; + // Display + rawImage.texture = texture; + aspectFitter.aspectRatio = (float)texture.width / texture.height; + // Invoke event + if (!presented) + OnPresent?.Invoke(this); + presented = true; + } + + private void OnDisable () => presented = false; + + void IPointerUpHandler.OnPointerUp (PointerEventData data) { + // Check manager + if (!cameraManager) + return; + // Check focus mode + if (focusMode != GestureMode.Tap && exposureMode != GestureMode.Tap) + return; + // Get press position + var rectTransform = transform as RectTransform; + if (!RectTransformUtility.ScreenPointToLocalPointInRectangle( + rectTransform, + data.position, + data.pressEventCamera, // or `enterEventCamera` + out var localPoint + )) + return; + // Focus + var cameraDevice = cameraManager.device; + var point = Rect.PointToNormalized(rectTransform.rect, localPoint); + if (cameraDevice.focusPointSupported && focusMode == GestureMode.Tap) + cameraDevice.SetFocusPoint(point.x, point.y); + if (cameraDevice.exposurePointSupported && exposureMode == GestureMode.Tap) + cameraDevice.SetExposurePoint(point.x, point.y); + } + + void IBeginDragHandler.OnBeginDrag (PointerEventData data) { // INCOMPLETE + + } + + void IDragHandler.OnDrag (PointerEventData data) { // INCOMPLETE + + } + #endregion + } +} \ No newline at end of file diff --git a/Runtime/UI/VideoKitCameraView.cs.meta b/Runtime/UI/VideoKitCameraView.cs.meta new file mode 100644 index 0000000..6a018f7 --- /dev/null +++ b/Runtime/UI/VideoKitCameraView.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: 145a5a333d6e14379ae91956c0b3373e +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {fileID: 2800000, guid: 7728882fe6960415e897bae270b67d4e, type: 3} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/UI/VideoKitRecordButton.cs b/Runtime/UI/VideoKitRecordButton.cs new file mode 100644 index 0000000..12c360a --- /dev/null +++ b/Runtime/UI/VideoKitRecordButton.cs @@ -0,0 +1,110 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +namespace VideoKit.UI { + + using System.Collections; + using UnityEngine; + using UnityEngine.UI; + using UnityEngine.Events; + using UnityEngine.EventSystems; + + /// + /// Lightweight record button with press-and-hold gesture. + /// + [RequireComponent(typeof(Image)), AddComponentMenu(@""), DisallowMultipleComponent] + public sealed class VideoKitRecordButton : MonoBehaviour, IPointerDownHandler, IPointerUpHandler { + + #region --Inspector-- + [Header(@"Settings")] + /// + /// Recorder to control. + /// + [Tooltip(@"Recorder to control.")] + public VideoKitRecorder recorder; + + /// + /// Maximum duration that button can be pressed. + /// + [Range(5f, 60f), Tooltip(@"Maximum duration that button can be pressed.")] + public float maxDuration = 10f; + + [Header(@"UI")] + /// + /// Countdown image. + /// + [Tooltip(@"Countdown image.")] + public Image countdown; + + [Header(@"Events")] + /// + /// Event invoked when recording is started. + /// + [Tooltip(@"Event invoked when recording is started.")] + public UnityEvent OnStartRecording; + + /// + /// Event invoked when recording is stopped. 
+ /// + [Tooltip(@"Event invoked when recording is stopped.")] + public UnityEvent OnStopRecording; + #endregion + + + #region --Operations-- + private Image button; + private bool touch; + + private void Reset () => recorder = FindObjectOfType(); + + private void Awake () => button = GetComponent(); + + private void Start () => Zero(); + + private void Zero () { + button.fillAmount = 1.0f; + countdown.fillAmount = 0.0f; + } + + private IEnumerator Countdown () { + touch = true; + // Wait for false touch + yield return new WaitForSeconds(0.2f); + if (!touch) + yield break; + // Start recording + StartRecording(); + // Animate the countdown + var startTime = Time.time; + while (touch) { + var ratio = (Time.time - startTime) / maxDuration; + touch = ratio <= 1f; + countdown.fillAmount = ratio; + button.fillAmount = 1f - ratio; + yield return null; + } + // Reset + Zero(); + StopRecording(); + } + + private async void StartRecording () { + if (recorder != null) + await recorder.StartRecording(); + OnStartRecording?.Invoke(); + } + + private async void StopRecording () { + if (recorder != null) + await recorder.StopRecording(); + OnStopRecording?.Invoke(); + } + + void IPointerDownHandler.OnPointerDown (PointerEventData eventData) => StartCoroutine(Countdown()); + + void IPointerUpHandler.OnPointerUp (PointerEventData eventData) => touch = false; + #endregion + } +} \ No newline at end of file diff --git a/Runtime/UI/VideoKitRecordButton.cs.meta b/Runtime/UI/VideoKitRecordButton.cs.meta new file mode 100644 index 0000000..782a441 --- /dev/null +++ b/Runtime/UI/VideoKitRecordButton.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: 0ac74b706104a475c98f83f30de96193 +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {fileID: 2800000, guid: 7728882fe6960415e897bae270b67d4e, type: 3} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Utilities.meta b/Runtime/Utilities.meta new file mode 100644 index 0000000..577a890 --- /dev/null +++ b/Runtime/Utilities.meta @@ -0,0 +1,8 @@ +fileFormatVersion: 2 +guid: d2b26bd7399014052ac56d94079f9f3a +folderAsset: yes +DefaultImporter: + externalObjects: {} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Utilities/LifecycleHelper.cs b/Runtime/Utilities/LifecycleHelper.cs new file mode 100644 index 0000000..577973f --- /dev/null +++ b/Runtime/Utilities/LifecycleHelper.cs @@ -0,0 +1,58 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +namespace VideoKit.Utilities { + + using System; + using UnityEngine; + + [DefaultExecutionOrder(-1000)] + internal sealed class LifecycleHelper : MonoBehaviour, IDisposable { + + #region --Client API-- + /// + /// Event invoked when Unity's `Update` message is invoked. + /// + public event Action onUpdate; + + /// + /// Event invoked when the app is paused or resumed. + /// + public event Action onPause; + + /// + /// Event invoked when the app is exiting. + /// + public event Action onQuit; + + /// + /// Create a lifecycle helper. + /// + public static LifecycleHelper Create () => new GameObject(@"VideoKit Helper").AddComponent(); + + /// + /// Dispose the helper. 
+ /// + public void Dispose () { + onPause = null; + onQuit = null; + Destroy(gameObject); + DestroyImmediate(this); + } + #endregion + + + #region --Operations-- + + private void Awake () => DontDestroyOnLoad(gameObject); + + private void Update () => onUpdate?.Invoke(); + + private void OnApplicationPause (bool pause) => onPause?.Invoke(pause); + + private void OnApplicationQuit () => onQuit?.Invoke(); + #endregion + } +} \ No newline at end of file diff --git a/Runtime/Utilities/LifecycleHelper.cs.meta b/Runtime/Utilities/LifecycleHelper.cs.meta new file mode 100644 index 0000000..39296ad --- /dev/null +++ b/Runtime/Utilities/LifecycleHelper.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: ca8819ff6c2d343cbab5032c8a41d8c9 +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {fileID: 2800000, guid: d6df589e0376343f7a3b406329630a74, type: 3} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Utilities/RingBuffer.cs b/Runtime/Utilities/RingBuffer.cs new file mode 100644 index 0000000..8bbd9d7 --- /dev/null +++ b/Runtime/Utilities/RingBuffer.cs @@ -0,0 +1,146 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +namespace VideoKit.Utilities { + + using System; + using Unity.Collections; + using Unity.Collections.LowLevel.Unsafe; + + internal sealed class RingBuffer where T : unmanaged { + + #region --Client API-- + /// + /// Current buffer length. + /// This is the number of elements that can be read from the buffer. + /// + public int Length => (write - read + 1); + + /// + /// Number of elements that can be written to the buffer. + /// + public int Available => Capacity - Length; + + /// + /// Buffer capacity. + /// + public int Capacity => buffer.Length; + + /// + /// Create a ring buffer. + /// + /// Buffer capacity. + public RingBuffer (int capacity) { + this.buffer = new T[capacity]; + Clear(); + } + + /// + /// Clear the buffer. + /// + public void Clear () { + this.read = 0; + this.write = -1; + Array.Clear(buffer, 0, buffer.Length); + } + + /// + /// Read elements from the buffer. + /// + /// Destination array. + public void Read (T[] destination) => Read(destination, 0, destination.Length); + + /// + /// Read elements from the buffer. + /// + /// Destinationa array. + /// Destination start index. + /// Number of elements to read. + public unsafe void Read (T[] destination, int index, int length) { + fixed (T* baseAddress = destination) + Read(&baseAddress[index], length); + } + + /// + /// Read elements from the buffer. + /// + /// Destinationa array. + public unsafe void Read (NativeArray destination) => Read((T*)destination.GetUnsafeReadOnlyPtr(), destination.Length); + + /// + /// Read elements from the buffer. + /// + /// Destination buffer. + /// Number of elements to read. + public unsafe void Read (T* destination, int length) { + // Check + if (length > Length) + throw new InvalidOperationException(@"Cannot read because buffer length is less than destination length"); + // Read + var startIndex = read % Capacity; + var endIndex = (startIndex + length) % Capacity; + var firstCopyLength = endIndex < startIndex ? 
Capacity - startIndex : endIndex - startIndex; + var secondCopyLength = length - firstCopyLength; + fixed (T* baseAddress = buffer) { + UnsafeUtility.MemCpy(destination, &baseAddress[startIndex], firstCopyLength * sizeof(T)); + UnsafeUtility.MemCpy(&destination[firstCopyLength], baseAddress, secondCopyLength * sizeof(T)); + } + // Update marker + read += length; + } + + /// + /// Write elements to the buffer. + /// + /// Source array. + public void Write (T[] source) => Write(source, 0, source.Length); + + /// + /// Write elements to the buffer. + /// + /// Source array. + /// Source start index. + /// Number of elements to write. + public unsafe void Write (T[] source, int index, int length) { + fixed (T* baseAddress = source) + Write(&baseAddress[index], length); + } + + /// + /// Write elements to the buffer. + /// + /// Source array. + public unsafe void Write (NativeArray source) => Write((T*)source.GetUnsafeReadOnlyPtr(), source.Length); + + /// + /// + /// Source buffer. + /// Number of elements to write. + public unsafe void Write (T* source, int length) { + // Check + if (length > Available) + throw new InvalidOperationException(@"Cannot write because source length exceeds available space"); + // Write + var startIndex = (write + 1) % Capacity; + var endIndex = (startIndex + length) % Capacity; + var firstCopyLength = endIndex < startIndex ? Capacity - startIndex : endIndex - startIndex; + var secondCopyLength = length - firstCopyLength; + fixed (T* baseAddress = buffer) { + UnsafeUtility.MemCpy(&baseAddress[startIndex], source, firstCopyLength * sizeof(T)); + UnsafeUtility.MemCpy(baseAddress, &source[firstCopyLength], secondCopyLength * sizeof(T)); + } + // Update marker + write += length; + } + #endregion + + + #region --Operations-- + private readonly T[] buffer; + private int read; + private int write; + #endregion + } +} \ No newline at end of file diff --git a/Runtime/Utilities/RingBuffer.cs.meta b/Runtime/Utilities/RingBuffer.cs.meta new file mode 100644 index 0000000..7c9e514 --- /dev/null +++ b/Runtime/Utilities/RingBuffer.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: 4569f2514d82c4828801b48ba0cc610c +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {fileID: 2800000, guid: d6df589e0376343f7a3b406329630a74, type: 3} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/Utilities/SharedSignal.cs b/Runtime/Utilities/SharedSignal.cs new file mode 100644 index 0000000..6274cff --- /dev/null +++ b/Runtime/Utilities/SharedSignal.cs @@ -0,0 +1,60 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +namespace VideoKit.Utilities { + + using System; + using System.Collections.Generic; + using System.Runtime.CompilerServices; + + internal sealed class SharedSignal { + + #region --Client API-- + /// + /// Whether the shared signal has been triggered. + /// + public bool signaled { + [MethodImpl(MethodImplOptions.Synchronized)] + get; + private set; + } + + /// + /// Event raised when the shared signal is triggered. + /// + public event Action OnSignal; + + /// + /// Create a shared signal. + /// + /// Number of unique signals required to trigger the shared signal. + public SharedSignal (int count) { + this.record = new HashSet(); + this.count = count; + } + + /// + /// Send a signal. + /// + /// Key to identify signal. 
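+ /// For illustration, a hypothetical rendezvous between two callers (`producerKey` and `consumerKey` are placeholder key objects):
+ ///
+ ///     var signal = new SharedSignal(2);
+ ///     signal.OnSignal += () => Debug.Log(@"Both participants signaled");
+ ///     signal.Signal(producerKey);   // first unique key; `signaled` is still false
+ ///     signal.Signal(consumerKey);   // second unique key; `OnSignal` is raised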
+ [MethodImpl(MethodImplOptions.Synchronized)] + public void Signal (object key) { + if (signaled) + return; + record.Add(key); + if (record.Count == count) { + OnSignal?.Invoke(); + signaled = true; + } + } + #endregion + + + #region --Operations-- + private readonly int count; + private readonly HashSet record; + #endregion + } +} \ No newline at end of file diff --git a/Runtime/Utilities/SharedSignal.cs.meta b/Runtime/Utilities/SharedSignal.cs.meta new file mode 100644 index 0000000..13c1279 --- /dev/null +++ b/Runtime/Utilities/SharedSignal.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: 63d4af2a63f5648c2b19dd022aa180df +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {fileID: 2800000, guid: d6df589e0376343f7a3b406329630a74, type: 3} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/VideoKit.Runtime.asmdef b/Runtime/VideoKit.Runtime.asmdef new file mode 100644 index 0000000..430181b --- /dev/null +++ b/Runtime/VideoKit.Runtime.asmdef @@ -0,0 +1,24 @@ +{ + "name": "VideoKit.Runtime", + "rootNamespace": "", + "references": [ + "Function.Runtime", + "NatML.Runtime" + ], + "includePlatforms": [ + "Android", + "Editor", + "iOS", + "macOSStandalone", + "WebGL", + "WindowsStandalone64" + ], + "excludePlatforms": [], + "allowUnsafeCode": true, + "overrideReferences": false, + "precompiledReferences": [], + "autoReferenced": true, + "defineConstraints": [], + "versionDefines": [], + "noEngineReferences": false +} \ No newline at end of file diff --git a/Runtime/VideoKit.Runtime.asmdef.meta b/Runtime/VideoKit.Runtime.asmdef.meta new file mode 100644 index 0000000..474d790 --- /dev/null +++ b/Runtime/VideoKit.Runtime.asmdef.meta @@ -0,0 +1,7 @@ +fileFormatVersion: 2 +guid: 73f8e9241b6244bd796104330e413694 +AssemblyDefinitionImporter: + externalObjects: {} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/VideoKitAudioManager.cs b/Runtime/VideoKitAudioManager.cs new file mode 100644 index 0000000..dc12f3e --- /dev/null +++ b/Runtime/VideoKitAudioManager.cs @@ -0,0 +1,193 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +namespace VideoKit { + + using System; + using System.Linq; + using System.Threading.Tasks; + using UnityEngine; + using UnityEngine.Events; + using Devices; + using Internal; + + /// + /// VideoKit audio manager for streaming audio from audio devices. + /// + [Tooltip(@"VideoKit audio manager for streaming audio from audio devices.")] + [HelpURL(@"https://docs.videokit.ai/videokit/api/videokitaudiomanager")] + [DisallowMultipleComponent] + public sealed class VideoKitAudioManager : VideoKitDeviceManager { + + #region --Enumerations-- + /// + /// Audio sample rate. + /// + public enum SampleRate : int { + /// + /// Match Unity's audio DSP sample rate. + /// + MatchUnity = 0, + /// + /// 8KHz. + /// + _8000 = 8000, + /// + /// 16KHz. + /// + _16000 = 16000, + /// + /// 22.05KHz. + /// + _22050 = 22050, + /// + /// 24KHz. + /// + _24000 = 24000, + /// + /// 44.1KHz. + /// + _44100 = 44100, + /// + /// 48KHz. + /// + _48000 = 48000, + } + + /// + /// Audio channel count. + /// + public enum ChannelCount : int { + /// + /// Match Unity's audio DSP channel count. + /// + MatchUnity = 0, + /// + /// Mono audio. + /// + Mono = 1, + /// + /// Stereo audio. + /// + Stereo = 2, + } + #endregion + + + #region --Inspector-- + [Header(@"Configuration")] + /// + /// Configure the application audio session on awake. + /// This only applies on iOS. 
+ /// + [Tooltip(@"Configure the application audio session on awake. This only applies on iOS.")] + public bool configureOnAwake = true; + + [Header(@"Format")] + /// + /// Audio sample rate. + /// + [Tooltip(@"Audio sample rate.")] + public SampleRate sampleRate = SampleRate.MatchUnity; + + /// + /// Audio channel count. + /// + [Tooltip(@"Audio channel count.")] + public ChannelCount channelCount = ChannelCount.MatchUnity; + + /// + /// Request echo cancellation if the device supports it. + /// + [Tooltip(@"Request echo cancellation if the device supports it.")] + public bool echoCancellation = false; + #endregion + + + #region --Client API-- + /// + /// Get or set the audio device used for streaming. + /// + public override AudioDevice device { + get => _device; + set { + // Switch mic without disposing output + // We deliberately skip configuring the mic like we do in `StartRunning` + if (running) { + _device.StopRunning(); + _device = value; + _device?.StartRunning(OnSampleBuffer); + } + // Handle trivial case + else + _device = value; + } + } + + /// + /// Whether the audio device is running. + /// + public override bool running => _device?.running ?? false; + + /// + /// Event raised when a new audio buffer is available. + /// + public event Action OnAudioBuffer; + + /// + /// Start streaming audio. + /// + public override async Task StartRunning () { + // Check + if (!isActiveAndEnabled) + throw new InvalidOperationException(@"VideoKit: Audio manager failed to start running because component is disabled"); + // Check + if (running) + return; + // Request microphone permissions + var permissions = await AudioDevice.CheckPermissions(request: true); + if (permissions != MediaDevice.PermissionStatus.Authorized) + throw new InvalidOperationException(@"VideoKit: User did not grant microphone permissions"); + // Check device + var devices = await AudioDevice.Discover(configureAudioSession: false); // configure once in `Awake` instead. + _device ??= devices.FirstOrDefault(); + if (_device == null) + throw new InvalidOperationException(@"VideoKit: Audio manager failed to start running because no audio device is available"); + // Configure microphone + _device.sampleRate = sampleRate == SampleRate.MatchUnity ? AudioSettings.outputSampleRate : (int)sampleRate; + _device.channelCount = channelCount == ChannelCount.MatchUnity ? (int)AudioSettings.speakerMode : (int)channelCount; + _device.echoCancellation = echoCancellation; // devices can say they don't support AEC even when they do + // Start running + _device.StartRunning(OnSampleBuffer); + } + + /// + /// Stop streaming audio. 
+ /// + public override void StopRunning () { + // Stop + if (running) + _device.StopRunning(); + } + #endregion + + + #region --Operations-- + private AudioDevice _device; + + private void Awake () { + if (configureOnAwake) + VideoKitExt.ConfigureAudioSession(); + } + + private void OnSampleBuffer (AudioBuffer audioBuffer) => OnAudioBuffer?.Invoke(audioBuffer); + + private void OnDestroy () { + if (running) + StopRunning(); + } + #endregion + } +} \ No newline at end of file diff --git a/Runtime/VideoKitAudioManager.cs.meta b/Runtime/VideoKitAudioManager.cs.meta new file mode 100644 index 0000000..f456129 --- /dev/null +++ b/Runtime/VideoKitAudioManager.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: 88aa3c596777f4cf489ae6096a962171 +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {fileID: 2800000, guid: 7728882fe6960415e897bae270b67d4e, type: 3} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/VideoKitCameraManager.cs b/Runtime/VideoKitCameraManager.cs new file mode 100644 index 0000000..6df7f3e --- /dev/null +++ b/Runtime/VideoKitCameraManager.cs @@ -0,0 +1,412 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +namespace VideoKit { + + using System; + using System.Collections; + using System.Collections.Generic; + using System.Linq; + using System.Threading.Tasks; + using UnityEngine; + using UnityEngine.Events; + using Unity.Collections; + using Unity.Collections.LowLevel.Unsafe; + using NatML; + using NatML.Features; + using AI; + using Devices; + using Devices.Outputs; + using Internal; + + /// + /// VideoKit camera manager for streaming video from camera devices. + /// + [Tooltip(@"VideoKit camera manager for streaming video from camera devices.")] + [HelpURL(@"https://docs.videokit.ai/videokit/api/videokitcameramanager")] + [DisallowMultipleComponent] + public sealed class VideoKitCameraManager : VideoKitDeviceManager { + + #region --Enumerations-- + /// + /// Camera manager capabilities. + /// + [Flags] + public enum Capabilities : int { + /// + /// Stream depth data along with the camera preview data. + /// This flag adds a minimal performance cost, so enable it only when necessary. + /// This flag is only supported on iOS and Android. + /// + Depth = 0b0001, + /// + /// Ensure that the camera preview data can be used for AI predictions. + /// Enabling this adds a minimal performance impact so it should be disabled when not needed. + /// This flag is always enabled on WebGL. + /// + AI = 0b00010, + /// + /// Generate a human texture from the camera preview stream. + /// This flag adds a variable performance cost, so enable it only when necessary. + /// NOTE: This requires an active VideoKit AI plan. + /// + HumanTexture = AI | 0b00100, + /// + /// Detect human poses in the camera preview stream. + /// This flag adds a variable performance cost, so enable it only when necessary. + /// NOTE: This requires an active VideoKit AI plan. + /// + PoseDetection = AI | 0b01000, + /// + /// Detect faces in the camera preview stream. + /// This flag adds a variable performance cost, so enable it only when necessary. + /// NOTE: This requires an active VideoKit AI plan. + /// + FaceDetection = AI | 0b10000, + } + + /// + /// Camera facing. + /// + public enum Facing : int { // bit flags: [prefer/require user/world] + /// + /// Prefer a user-facing camera but enable fallback to any available camera. 
+ /// + PreferUser = 0b00, + /// + /// Prefer a world-facing camera but enable fallback to any available camera. + /// + PreferWorld = 0b01, + /// + /// Require a user-facing camera. + /// + RequireUser = 0b10, + /// + /// Require a world-facing camera. + /// + RequireWorld = 0b11, + } + + /// + /// Camera resolution presets. + /// + public enum Resolution : int { + /// + /// Use the default camera resolution. + /// With this preset, the camera resolution will not be set. + /// + Default = 0, + /// + /// Lowest resolution supported by the camera device. + /// + Lowest = 1, + /// + /// SD resolution. + /// + _640x480 = 2, + /// + /// HD resolution. + /// + _1280x720 = 3, + /// + /// Full HD resolution. + /// + _1920x1080 = 4, + /// + /// 4K UHD resolution. + /// + _4K = 5, + /// + /// Highest resolution supported by the camera device. + /// Using this resolution is strongly not recommended. + /// + Highest = 10 + } + + /// + /// + public enum FrameRate : int { + /// + /// Use the default camera frame rate. + /// With this preset, the camera frame rate will not be set. + /// + Default = 0, + /// + /// + Lowest = 1, + /// + /// + _15 = 15, + /// + /// + _30 = 30, + /// + /// + _60 = 60, + /// + /// + _120 = 120, + /// + /// + _240 = 240 + } + #endregion + + + #region --Inspector-- + [Header(@"Configuration")] + /// + /// Desired camera capabilities. + /// + [Tooltip(@"Desired camera capabilities.")] + public Capabilities capabilities = 0; + + /// + /// Whether to start the camera preview as soon as the component awakes. + /// + [Tooltip(@"Whether to start the camera preview as soon as the component awakes.")] + public bool playOnAwake = true; + + [Header(@"Camera Settings")] + /// + /// Desired camera facing. + /// + [SerializeField, Tooltip(@"Desired camera facing.")] + private Facing _facing = Facing.PreferUser; + + /// + /// Desired camera resolution. + /// + [Tooltip(@"Desired camera resolution.")] + public Resolution resolution = Resolution._1280x720; + + /// + /// Desired camera frame rate. + /// + [Tooltip(@"Desired camera frame rate.")] + public FrameRate frameRate = FrameRate._30; + + /// + /// Desired camera focus mode. + /// + [Tooltip(@"Desired camera focus mode.")] + public CameraDevice.FocusMode focusMode = CameraDevice.FocusMode.Continuous; + + /// + /// Desired camera exposure mode. + /// + [Tooltip(@"Desired camera exposure mode.")] + public CameraDevice.ExposureMode exposureMode = CameraDevice.ExposureMode.Continuous; + + [Header(@"Events")] + /// + /// Event raised when a new camera frame is available in the camera manager. + /// The preview texture, human texture, and image feature will contain the latest camera image. + /// + [Tooltip(@"Event raised when a new camera frame is available.")] + public UnityEvent OnCameraFrame; + #endregion + + + #region --Client API-- + /// + /// Get or set the camera device used for streaming. + /// + public override CameraDevice device { + get => _device; + set { + // Switch cameras without disposing output + // We deliberately skip configuring the camera like we do in `StartRunning` + if (running) { + _device.StopRunning(); + _device = value; + _device?.StartRunning(output); + } + // Handle trivial case + else + _device = value; + } + } + + /// + /// Get or set the desired camera facing. + /// + public Facing facing { + get => _facing; + set { + if (_facing == value) + return; + _facing = value; + device = GetDefaultCameraDevice(_facing); + } + } + + /// + /// Get the latest camera image that has been processed by the camera manager. 
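+ /// The related accessors (`texture`, `pixelBuffer`, `imageFeature`, `humanTexture`) are refreshed from the same frame when `OnCameraFrame` is raised.
+ /// A hypothetical prediction sketch (`RunPrediction` is a placeholder):
+ ///
+ ///     var feature = cameraManager.imageFeature;  // requires the `AI` capability
+ ///     if (feature != null)
+ ///         RunPrediction(feature);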
+ /// + public CameraImage image => output?.image ?? default; + + /// + /// Get the camera preview texture. + /// + public Texture texture => output switch { + TextureOutput x => x.texture, + RenderTextureOutput x => x.texture, + _ => null, + }; + + /// + /// Get the camera preview pixel buffer. + /// The pixel buffer always has the `RGBA8888` layout. + /// NOTE: This requires the `AI` capability to be enabled. + /// + public NativeArray pixelBuffer => output switch { + TextureOutput x => x.texture.GetRawTextureData(), + _ => default, + }; + + /// + /// Get the camera preview image feature for AI predictions. + /// NOTE: This requires the `AI` capability to be enabled. + /// + public MLImageFeature imageFeature => output is TextureOutput textureOutput ? new MLImageFeature(textureOutput.texture) : null; + + /// + /// Get the camera human texture. + /// NOTE: This requires the `HumanTexture` capability to be enabled. + /// + public Texture2D humanTexture => matteKit?.humanTexture; + + /// + /// Whether the camera is running. + /// + public override bool running => output != null; + + /// + /// Event raised when a new camera image is provided by the camera device. + /// NOTE: This event is usually raised on the camera thread, not the Unity main thread. + /// + public event Action OnCameraImage; + + /// + /// Start the camera preview. + /// + public override async Task StartRunning () { + // Check + if (!isActiveAndEnabled) + throw new InvalidOperationException(@"VideoKit: Camera manager failed to start running because component is disabled"); + // Check + if (running) + return; + // Request camera permissions + var permissions = await CameraDevice.CheckPermissions(request: true); + if (permissions != MediaDevice.PermissionStatus.Authorized) + throw new InvalidOperationException(@"VideoKit: User did not grant camera permissions"); + // Discover devices + devices = await CameraDevice.Discover(); + // Check device + _device ??= GetDefaultCameraDevice(_facing); + if (_device == null) + throw new InvalidOperationException(@"VideoKit: Camera manager failed to start running because no camera device is available"); + // Configure camera + if (resolution != Resolution.Default) + _device.previewResolution = frameSize; + if (frameRate != FrameRate.Default) + _device.frameRate = (int)frameRate; + if (_device.FocusModeSupported(focusMode)) + _device.focusMode = focusMode; + if (_device.ExposureModeSupported(exposureMode)) + _device.exposureMode = exposureMode; + // Check capabilities + if (Application.platform == RuntimePlatform.WebGLPlayer) + capabilities |= Capabilities.AI; + // Create MatteKit predictor + if (capabilities.HasFlag(Capabilities.HumanTexture)) + matteKit ??= await MatteKitPredictor.Create(configuration: matteKitConfiguration); + // Create output + if (capabilities.HasFlag(Capabilities.AI)) { + var textureOutput = new TextureOutput(); + textureOutput.OnFrame += OnCameraTexture; + output = textureOutput; + } else { + var renderTextureOutput = new RenderTextureOutput(); + renderTextureOutput.OnFrame += OnCameraTexture; + output = renderTextureOutput; + } + // Start running + _device.StartRunning(UpdateCameraImage); + } + + /// + /// Stop the camera preview. 
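+ /// A minimal preview sketch (hypothetical; assumes a `RawImage` named `preview` and this manager referenced as `cameraManager`):
+ ///
+ ///     await cameraManager.StartRunning();
+ ///     cameraManager.OnCameraFrame.AddListener(() => preview.texture = cameraManager.texture);
+ ///     // ... later ...
+ ///     cameraManager.StopRunning();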
+ /// + public override void StopRunning () { + _device?.StopRunning(); + output?.Dispose(); + output = null; + } + #endregion + + + #region --Operations-- + private CameraDevice[] devices; + private CameraDevice _device; + private CameraOutput output; + private MatteKitPredictor matteKit; + private readonly MLEdgeModel.Configuration matteKitConfiguration = new () { + computeTarget = MLEdgeModel.ComputeTarget.CPU // don't ask, trust + }; + + private (int width, int height) frameSize => resolution switch { + VideoKitCameraManager.Resolution.Lowest => (176, 144), + VideoKitCameraManager.Resolution._640x480 => (640, 480), + VideoKitCameraManager.Resolution._1280x720 => (1280, 720), + VideoKitCameraManager.Resolution._1920x1080 => (1920, 1080), + VideoKitCameraManager.Resolution._4K => (3840, 2160), + VideoKitCameraManager.Resolution.Highest => (5120, 2880), + _ => (1280, 720), + }; + + private async void Awake () { + if (playOnAwake) + await StartRunning(); + } + + private void UpdateCameraImage (CameraImage image) { + output.Update(image); + OnCameraImage?.Invoke(image); + } + + private void OnCameraTexture (TextureOutput output) { + matteKit?.Predict(imageFeature); + OnCameraFrame?.Invoke(); + } + + private void OnCameraTexture (RenderTextureOutput output) => OnCameraFrame?.Invoke(); + + private void OnDestroy () { + StopRunning(); + matteKit?.Dispose(); + } + + internal (int width, int height) GetPreviewSize () => output switch { + TextureOutput o => (o.texture.width, o.texture.height), + RenderTextureOutput o => (o.texture.width, o.texture.height), + _ => default, + }; + + private CameraDevice GetDefaultCameraDevice (Facing facing) { + // Get fallback device + var fallback = !facing.HasFlag(Facing.RequireUser); + var fallbackDevice = fallback ? devices?.FirstOrDefault() : null; + // Find device + var frontFacing = !facing.HasFlag(Facing.PreferWorld); + var device = devices?.FirstOrDefault(d => frontFacing && d.frontFacing) ?? fallbackDevice; + // Return + return device; + } + #endregion + } +} \ No newline at end of file diff --git a/Runtime/VideoKitCameraManager.cs.meta b/Runtime/VideoKitCameraManager.cs.meta new file mode 100644 index 0000000..c966cc9 --- /dev/null +++ b/Runtime/VideoKitCameraManager.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: 98622613d505143a9a06b4b11b0a82cf +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {fileID: 2800000, guid: 7728882fe6960415e897bae270b67d4e, type: 3} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/VideoKitDeviceManager.cs b/Runtime/VideoKitDeviceManager.cs new file mode 100644 index 0000000..cf7cdd0 --- /dev/null +++ b/Runtime/VideoKitDeviceManager.cs @@ -0,0 +1,40 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +#nullable enable + +namespace VideoKit { + + using System.Threading.Tasks; + using UnityEngine; + using Devices; + + /// + /// VideoKit media device manager. + /// + public abstract class VideoKitDeviceManager : MonoBehaviour where T : MediaDevice { + + /// + /// Get or set the media device used for streaming. + /// If this is `null` the manager will find a suitable default to start streaming. + /// + public abstract T? device { get; set; } + + /// + /// Whether the device manager is streaming. + /// + public abstract bool running { get; } + + /// + /// Start streamming sample buffers from the current device. 
+ /// + public abstract Task StartRunning (); + + /// + /// Stop streaming sample buffers from the current device. + /// + public abstract void StopRunning (); + } +} \ No newline at end of file diff --git a/Runtime/VideoKitDeviceManager.cs.meta b/Runtime/VideoKitDeviceManager.cs.meta new file mode 100644 index 0000000..3bae091 --- /dev/null +++ b/Runtime/VideoKitDeviceManager.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: 093debbc96e6a4391a685151d3d8f297 +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {fileID: 2800000, guid: 7728882fe6960415e897bae270b67d4e, type: 3} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/VideoKitEditor.cs b/Runtime/VideoKitEditor.cs new file mode 100644 index 0000000..c997cf2 --- /dev/null +++ b/Runtime/VideoKitEditor.cs @@ -0,0 +1,56 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +namespace VideoKit { + + using System; + using UnityEngine; + + /// + /// VideoKit editor for editing videos. + /// + [Tooltip(@"VideoKit editor for editing videos.")] + [HelpURL(@"https://docs.videokit.ai/videokit/api/videokiteditor")] + [DisallowMultipleComponent] + [AddComponentMenu(@"")] // Hide this for now + internal sealed class VideoKitEditor : MonoBehaviour { + + #region --Enumerations-- + /// + /// Video editor capabilities. + /// + [Flags] + public enum Capabilities : int { + /// + /// Generate speech captions for the video. + /// + Captions = 0b001, + } + #endregion + + + #region --Inspector-- + /// + /// Auto-play the video. + /// + public bool autoPlay; + + /// + /// Mute any audio from the video. + /// + public bool mute; + #endregion + + + #region --Client API-- + + #endregion + + + #region --Operations-- + + #endregion + } +} \ No newline at end of file diff --git a/Runtime/VideoKitEditor.cs.meta b/Runtime/VideoKitEditor.cs.meta new file mode 100644 index 0000000..33a1c6c --- /dev/null +++ b/Runtime/VideoKitEditor.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: f8ac56017f2ce4c3a8271998d625a2c6 +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {fileID: 2800000, guid: 7728882fe6960415e897bae270b67d4e, type: 3} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/VideoKitExtensions.cs b/Runtime/VideoKitExtensions.cs new file mode 100644 index 0000000..dcf9b7d --- /dev/null +++ b/Runtime/VideoKitExtensions.cs @@ -0,0 +1,31 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +namespace VideoKit { + + using System.Linq; + + /// + /// VideoKit extension methods. + /// + public static class VideoKitExtensions { + + #region --Client API-- + /// + /// Whether this format supports recording video frames. + /// + public static bool SupportsVideo (this MediaFormat format) => new [] { + MediaFormat.MP4, MediaFormat.HEVC, MediaFormat.WEBM, MediaFormat.GIF, MediaFormat.JPEG + }.Contains(format); + + /// + /// Whether this format supports recording audio frames. 
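+ /// A short sketch of how a recorder might guard its inputs with these checks (illustrative only):
+ ///
+ ///     var format = MediaFormat.MP4;
+ ///     if (format.SupportsVideo()) { /* attach a video input */ }
+ ///     if (format.SupportsAudio()) { /* attach an audio input */ }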
+ /// + public static bool SupportsAudio (this MediaFormat format) => new [] { + MediaFormat.MP4, MediaFormat.HEVC, MediaFormat.WEBM, MediaFormat.WAV + }.Contains(format); + #endregion + } +} \ No newline at end of file diff --git a/Runtime/VideoKitExtensions.cs.meta b/Runtime/VideoKitExtensions.cs.meta new file mode 100644 index 0000000..b5b7f50 --- /dev/null +++ b/Runtime/VideoKitExtensions.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: 15f40a20443b240d7a5f551ace9fa726 +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {fileID: 2800000, guid: 7728882fe6960415e897bae270b67d4e, type: 3} + userData: + assetBundleName: + assetBundleVariant: diff --git a/Runtime/VideoKitRecorder.cs b/Runtime/VideoKitRecorder.cs new file mode 100644 index 0000000..9ba31bf --- /dev/null +++ b/Runtime/VideoKitRecorder.cs @@ -0,0 +1,668 @@ +/* +* VideoKit +* Copyright © 2023 NatML Inc. All Rights Reserved. +*/ + +namespace VideoKit { + + using System; + using System.Collections.Generic; + using System.IO; + using System.Threading.Tasks; + using UnityEngine; + using UnityEngine.Events; + using Assets; + using Devices; + using Internal; + using Recorders; + using Recorders.Clocks; + using Recorders.Inputs; + + /// + /// VideoKit recorder for recording videos. + /// + [Tooltip(@"VideoKit recorder for recording videos.")] + [HelpURL(@"https://docs.videokit.ai/videokit/api/videokitrecorder")] + [DisallowMultipleComponent] + public sealed partial class VideoKitRecorder : MonoBehaviour { + + #region --Enumerations-- + /// + /// Video recording mode. + /// + public enum VideoMode : int { + /// + /// Don't record video. + /// + None = 0, + /// + /// Record video frames from one or more game cameras. + /// + Camera = 1, + /// + /// Record video frames from the screen. + /// + Screen = 2, + /// + /// Record video frames from a texture. + /// + Texture = 3, + /// + /// Record video frames from a camera device. + /// + CameraDevice = 4, + } + + /// + /// Audio recording mode. + /// + [Flags] + public enum AudioMode : int { + /// + /// Don't record audio. + /// + None = 0b00, + /// + /// Record audio frames from the scene's audio listener. + /// + AudioListener = 0b01, + /// + /// Record audio frames from an audio device. + /// + AudioDevice = 0b10, + } + + /// + /// Video recording resolution presets. + /// + public enum Resolution : int { + /// + /// QVGA resolution. + /// + _240xAuto = 11, + /// + /// QVGA resolution. + /// + _320xAuto = 5, + /// + /// HVGA resolution. + /// + _480xAuto = 6, + /// + /// SD resolution. + /// + _640xAuto = 0, + /// + /// Potrait HD resolution. + /// + _720xAuto = 7, + /// + /// Portrait Full HD resolution. + /// + _1080xAuto = 12, + /// + /// HD resolution. + /// + _1280xAuto = 1, + /// + /// Full HD resolution. + /// + _1920xAuto = 2, + /// + /// 2K WQHD resolution. + /// + _2560xAuto = 3, + /// + /// 4K UHD resolution. + /// + _3840xAuto = 4, + /// + /// Screen resolution. + /// + Screen = 9, + /// + /// Half of the screen resolution. + /// + HalfScreen = 10, + /// + /// Custom resolution. + /// + Custom = 8, + } + + /// + /// Recorder status. + /// + public enum Status : int { + /// + /// No recording session is in progress. + /// + Idle = 0, + /// + /// Recording session is in progress. + /// + Recording = 1, + /// + /// Recording session is in progress but is paused. + /// + Paused = 2, + } + + /// + /// Video watermark mode. + /// + public enum WatermarkMode : int { + /// + /// No watermark. 
+ /// + None = 0, + /// + /// Place watermark in the bottom-left of the frame. + /// + BottomLeft = 1, + /// + /// Place watermark in the bottom-right of the frame. + /// + BottomRight = 2, + /// + /// Place watermark in the upper-left of the frame. + /// + UpperLeft = 3, + /// + /// Place watermark in the upper-right of the frame. + /// + UpperRight = 4, + /// + /// Place watermark in a user-defined rectangle. + /// Set the rect with the `watermarkRect` property. + /// + Custom = 5, + } + + /// + /// Recording action. + /// + [Flags] + public enum RecordingAction : int { + /// + /// Nothing. + /// + None = 0, + /// + /// Save the media asset to the camera roll. + /// + CameraRoll = 1 << 1, + /// + /// Prompt the user to share the media asset with the native sharing UI. + /// + Share = 1 << 2, + /// + /// Playback the video with the platform default media player. + /// + Playback = 1 << 3, + /// + /// Delete the media asset immediately. + /// + Delete = 1 << 4, + /// + /// Define a custom callback to receive the media asset. + /// NOTE: This is mutually exclusive with all other recording actions. + /// + Custom = 1 << 5, + } + #endregion + + + #region --Inspector-- + [Header(@"Format")] + /// + /// Recording format. + /// + [Tooltip(@"Recording format.")] + public MediaFormat format = MediaFormat.MP4; + + /// + /// Prepare the hardware encoders on awake. + /// This prevents a noticeable stutter that occurs on the very first recording. + /// + [Tooltip(@"Prepare the hardware encoders on awake. This prevents a noticeable stutter that occurs on the very first recording.")] + public bool prepareOnAwake = false; + + [Header(@"Video")] + /// + /// Video recording mode. + /// + [Tooltip(@"Video recording mode.")] + public VideoMode videoMode = VideoMode.Camera; + + /// + /// Video recording resolution. + /// + [Tooltip(@"Video recording resolution.")] + public Resolution resolution = Resolution._1280xAuto; + + /// + /// Video recording custom resolution. + /// This is only used when `resolution` is set to `Resolution.Custom`. + /// + [Tooltip(@"Video recording custom resolution.")] + public Vector2Int customResolution = new Vector2Int(1280, 720); + + /// + /// Game cameras to record. + /// + [Tooltip(@"Game cameras to record.")] + public Camera[] cameras; + + /// + /// Recording texture for recording video frames from a texture. + /// + [Tooltip(@"Recording texture for recording video frames from a texture.")] + public Texture texture; + + /// + /// Camera manager for recording video frames from a camera device. + /// + [Tooltip(@"Camera manager for recording video frames from a camera device.")] + public VideoKitCameraManager cameraManager; + + /// + /// Frame rate for animated GIF images. + /// This only applies when recording GIF images. + /// + [Tooltip(@"Frame rate for animated GIF images."), Range(5f, 30f)] + public float frameRate = 10f; + + /// + /// Number of successive camera frames to skip while recording. + /// + [Tooltip(@"Number of successive camera frames to skip while recording."), Range(0, 5)] + public int frameSkip = 0; + + [Header(@"Watermark")] + /// + /// Recording watermark mode for adding a watermark to videos. + /// + [Tooltip(@"Recording watermark mode for adding a watermark to videos.")] + public WatermarkMode watermarkMode = WatermarkMode.None; + + /// + /// Recording watermark. + /// + [Tooltip(@"Recording watermark.")] + public Texture watermark; + + /// + /// Watermark display rect when `watermarkMode` is set to `WatermarkMode.Custom`. 
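+ /// Illustrative example (arbitrary values, on a hypothetical `recorder` reference): `recorder.watermarkRect = new RectInt(16, 16, 256, 256);` places a 256x256 watermark near the bottom-left of the frame, matching the origin convention used by the built-in presets.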
+ /// + [Tooltip(@"Watermark display rect when `watermarkMode` is set to `WatermarkMode.Custom`")] + public RectInt watermarkRect; + + [Header(@"Audio")] + /// + /// Audio recording mode. + /// + [Tooltip(@"Audio recording mode.")] + public AudioMode audioMode = AudioMode.None; + + /// + /// Audio manager for recording audio sample buffers from an audio device. + /// + [Tooltip(@"Audio manager for recording audio sample buffers from an audio device.")] + public VideoKitAudioManager audioManager; + + /// + /// Whether the recorder can configure the audio manager for recording. + /// Unless you intend to override the audio manager configuration, leave this `true`. + /// + [Tooltip(@"Whether the recorder can configure the audio manager for recording.")] + public bool configureAudioManager = true; + + /// + /// Audio device gain when recording both game and microphone audio. + /// + [Tooltip(@"Audio device gain when recording both game and microphone audio."), Range(1f, 5f)] + public float audioDeviceGain = 2f; + + [Header(@"Recording")] + /// + /// Recording action. + /// + [Tooltip(@"Recording action.")] + public RecordingAction recordingAction = 0; + + /// + /// Event raised when a recording session is completed. + /// + [Tooltip(@"Event raised when a recording session is completed.")] + public UnityEvent OnRecordingCompleted; + #endregion + + + #region --Client API-- + /// + /// Recording path prefix when saving recordings to the app's documents. + /// + [HideInInspector] + public string mediaPathPrefix = @"recordings"; + + /// + /// Video bit rate in bits per second. + /// + [HideInInspector] + public int videoBitRate = 10_000_000; + + /// + /// Video keyframe interval in seconds. + /// + [HideInInspector] + public int keyframeInterval = 2; + + /// + /// Audio bit rate in bits per second. + /// + [HideInInspector] + public int audioBitRate = 64_000; + + /// + /// Recorder status. + /// + public Status status => clock?.paused switch { + true => Status.Paused, + false => Status.Recording, + null => Status.Idle, + }; + + /// + /// Start recording. 
+ /// + public async Task StartRecording () { + // Check active + if (!isActiveAndEnabled) + throw new InvalidOperationException(@"VideoKitRecorder cannot start recording because component is disabled"); + // Check status + if (status != Status.Idle) + throw new InvalidOperationException(@"VideoKitRecorder cannot start recording because a recording session is already in progress"); + // Check camera device mode + if (videoMode == VideoMode.CameraDevice) { + // Check camera manager + if (!cameraManager) + throw new InvalidOperationException(@"VideoKitRecorder cannot start recording because the video mode is set to `VideoMode.CameraDevice` but `cameraManager` is null"); + // Check camera preview + if (!cameraManager.running) + throw new InvalidOperationException(@"VideoKitRecorder cannot start recording because the video mode is set to `VideoMode.CameraDevice` but the `cameraManager` is not running"); + } + // Check audio mode + if (audioMode.HasFlag(AudioMode.AudioListener) && Application.platform == RuntimePlatform.WebGLPlayer) { + Debug.LogWarning(@"VideoKitRecorder cannot record audio from AudioListener because WebGL does not support `OnAudioFilterRead`"); + audioMode &= ~AudioMode.AudioListener; + } + // Check audio device + if (audioMode.HasFlag(AudioMode.AudioDevice)) { + // Check audio manager + if (!audioManager) + throw new InvalidOperationException(@"VideoKitRecorder cannot start recording because the audio mode includes `AudioMode.AudioDevice` but `audioManager` is null"); + // Configure audio manager + if (configureAudioManager) { + // Set format + if (audioMode.HasFlag(AudioMode.AudioListener)) { + audioManager.sampleRate = VideoKitAudioManager.SampleRate.MatchUnity; + audioManager.channelCount = VideoKitAudioManager.ChannelCount.MatchUnity; + } + // Start running + await audioManager.StartRunning(); + } + // Check audio stream + if (!audioManager.running) + throw new InvalidOperationException(@"VideoKitRecorder cannot start recording because the audio mode includes to `AudioMode.AudioDevice` but the `audioManager` is not running"); + } + // Check format + if (format == MediaFormat.MP4 && Application.platform == RuntimePlatform.WebGLPlayer) { + format = MediaFormat.WEBM; + Debug.LogWarning(@"VideoKitRecorder will use WEBM format on WebGL because MP4 is not supported"); + } + // Create recorder + var (width, height) = CreateVideoFormat(); + var (sampleRate, channelCount) = CreateAudioFormat(); + recorder = await MediaRecorder.Create( + format, + width: width, + height: height, + frameRate: format == MediaFormat.GIF ? frameRate : 30, + sampleRate: sampleRate, + channelCount: channelCount, + videoBitRate: videoBitRate, + keyframeInterval: keyframeInterval, + compressionQuality: 0.8f, + audioBitRate: audioBitRate, + prefix: mediaPathPrefix + ); + // Create inputs + clock = new RealtimeClock(); + videoInput = CreateVideoInput(); + audioInput = CreateAudioInput(); + } + + /// + /// Pause recording. + /// + public void PauseRecording () { + // Check + if (status != Status.Recording) { + Debug.LogError(@"Cannot pause recording because no recording session is in progress"); + return; + } + // Stop audio manager + if (configureAudioManager && audioManager) + audioManager.StopRunning(); + // Dispose inputs + videoInput?.Dispose(); // this implicitly disposes the `textureInput`, perhaps not the best API design + audioInput?.Dispose(); + videoInput = null; + audioInput = null; + // Pause clock + clock.paused = true; + } + + /// + /// Resume recording. 
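+ /// Only valid while a recording session is paused; illustrative: `recorder.PauseRecording(); /* e.g. app backgrounded */ recorder.ResumeRecording();`.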
+ /// + public async void ResumeRecording () { + // Check status + if (status != Status.Paused) { + Debug.LogError(@"Cannot resume recording because the recording session is not paused"); + return; + } + // Check active + if (!isActiveAndEnabled) { + Debug.LogError(@"Cannot resume recording because component is disabled"); + return; + } + // Check audio manager + if (configureAudioManager && audioManager) + await audioManager.StartRunning(); + // Unpause clock + clock.paused = false; + // Create inputs + videoInput = CreateVideoInput(); + audioInput = CreateAudioInput(); + } + + /// + /// Stop recording. + /// + public async Task StopRecording () { + // Check + if (status == Status.Idle) { + Debug.LogWarning(@"Cannot stop recording because no recording session is in progress"); + return; + } + // Stop audio manager + if (configureAudioManager && audioManager) + audioManager.StopRunning(); + // Stop inputs + audioInput?.Dispose(); + videoInput?.Dispose(); + videoInput = null; + audioInput = null; + clock = null; + // Stop recording + var recordingPath = await recorder.FinishWriting(); + var mediaAsset = await MediaAsset.FromFile(recordingPath); + // Check that this is not result of disabling + if (!isActiveAndEnabled) { + mediaAsset.Delete(); + return; + } + // Post action + if (recordingAction.HasFlag(RecordingAction.Custom)) + OnRecordingCompleted?.Invoke(mediaAsset); + if (recordingAction.HasFlag(RecordingAction.CameraRoll)) + await mediaAsset.SaveToCameraRoll(); + if (recordingAction.HasFlag(RecordingAction.Share)) + await mediaAsset.Share(); + if (recordingAction.HasFlag(RecordingAction.Playback) && mediaAsset is VideoAsset videoAsset) + videoAsset.Playback(); + if (recordingAction.HasFlag(RecordingAction.Delete)) + mediaAsset.Delete(); + } + #endregion + + + #region --Operations-- + private MediaRecorder recorder; + private RealtimeClock clock; + private IDisposable videoInput; + private IDisposable audioInput; + + private void Reset () { + cameras = Camera.allCameras; + cameraManager = FindObjectOfType(); + audioManager = FindObjectOfType(); + } + + private async void Awake () { + if (prepareOnAwake) + await PrepareEncoder(); + } + + private async void OnDestroy () { + if (status != Status.Idle) + await StopRecording(); + } + + private (int width, int height) CreateVideoFormat () { + // Custom resolution + if (resolution == Resolution.Custom) + return (customResolution.x, customResolution.y); + // Screen resolution + if (resolution == Resolution.Screen) + return (Screen.width >> 1 << 1, Screen.height >> 1 << 1); + // Half screen resolution + if (resolution == Resolution.HalfScreen) + return (Screen.width >> 2 << 1, Screen.height >> 2 << 1); + // Get video size + var width = GetVideoWidth(); + var aspect = GetVideoAspect(); + var height = Mathf.RoundToInt(width / aspect) >> 1 << 1; + return (width, height); + } + + private int GetVideoWidth () => resolution switch { + Resolution._240xAuto => 240, + Resolution._320xAuto => 320, + Resolution._480xAuto => 480, + Resolution._640xAuto => 640, + Resolution._720xAuto => 720, + Resolution._1080xAuto => 1080, + Resolution._1280xAuto => 1280, + Resolution._1920xAuto => 1920, + Resolution._2560xAuto => 2560, + Resolution._3840xAuto => 3840, + _ => 1280, + }; + + private float GetVideoAspect () => videoMode switch { + VideoMode.Camera => (float)Screen.width / Screen.height, + VideoMode.Screen => (float)Screen.width / Screen.height, + VideoMode.Texture => (float)texture.width / texture.height, + VideoMode.CameraDevice => 
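+ // Camera-device recording derives the aspect ratio from the preview size reported by the camera manager.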
(float)cameraManager.GetPreviewSize().width / cameraManager.GetPreviewSize().height, + _ => 1f, + }; + + private (int sampleRate, int channelCount) CreateAudioFormat () => audioMode switch { + var mode when mode.HasFlag(AudioMode.AudioListener) => (AudioSettings.outputSampleRate, (int)AudioSettings.speakerMode), + AudioMode.AudioDevice => (audioManager?.device?.sampleRate ?? 0, audioManager?.device?.channelCount ?? 0), + _ => (0, 0), + }; + + private RecorderTextureInput CreateTextureInput () => new RecorderTextureInput(recorder, new [] { // no side effects + new WatermarkTextureInput(null as TextureInput) { // MUST be `TextureInput` + watermark = watermarkMode != WatermarkMode.None ? watermark : null, + rect = CreateWatermarkRect(recorder, watermarkMode, watermarkRect), + }, + }); + + private IDisposable CreateVideoInput () => videoMode switch { // no side effects + var x when !format.SupportsVideo() => null, + VideoMode.Screen => new ScreenInput(CreateTextureInput(), clock) { frameSkip = frameSkip }, + VideoMode.Camera => new CameraInput(CreateTextureInput(), clock, cameras) { frameSkip = frameSkip }, + VideoMode.Texture => new CustomTextureInput(CreateTextureInput(), clock, texture) { frameSkip = frameSkip }, + VideoMode.CameraDevice => new CameraDeviceInput(CreateTextureInput(), clock, cameraManager) { frameSkip = frameSkip }, + _ => null, + }; + + private IDisposable CreateAudioInput () => audioMode switch { // no side effects + var x when !format.SupportsAudio() => null, + var x when x.HasFlag(AudioMode.AudioDevice | AudioMode.AudioListener) => new AudioMixerInput(recorder, clock, audioManager, FindObjectOfType()) { audioDeviceGain = audioDeviceGain }, + AudioMode.AudioListener => new AudioInput(recorder, clock, FindObjectOfType()), + AudioMode.AudioDevice => new AudioDeviceInput(recorder, clock, audioManager), + _ => null, + }; + #endregion + + + #region --Utility-- + + private static RectInt CreateWatermarkRect (MediaRecorder recorder, WatermarkMode mode, RectInt customRect) { + // Check none + if (mode == WatermarkMode.None) + return default; + // Check custom + if (mode == WatermarkMode.Custom) + return customRect; + // Construct rect + var (width, height) = recorder.frameSize; + var NormalizedPositions = new Dictionary { + [WatermarkMode.BottomLeft] = new Vector2(0.2f, 0.15f), + [WatermarkMode.BottomRight] = new Vector2(0.8f, 0.15f), + [WatermarkMode.UpperLeft] = new Vector2(0.2f, 0.85f), + [WatermarkMode.UpperRight] = new Vector2(0.8f, 0.85f), + }; + var normalizedPosition = NormalizedPositions[mode]; + var position = Vector2Int.RoundToInt(Vector2.Scale(normalizedPosition, new Vector2(width, height))); + var size = Mathf.RoundToInt(0.15f * Mathf.Max(width, height)); + var rect = new RectInt(position.x - size / 1, position.y - size / 1, size, size); + return rect; + } + + private static async Task PrepareEncoder () { + // Check platform + if (Application.platform == RuntimePlatform.WebGLPlayer) + return; + // Create recorder + var width = 1280; + var height = 720; + var clock = new FixedIntervalClock(30); + var recorder = await MediaRecorder.Create(MediaFormat.MP4, width: width, height: height, frameRate: 30); + // Commit empty frames + var pixelBuffer = new byte[width * height * 4]; + for (var i = 0; i < 3; ++i) + recorder.CommitFrame(pixelBuffer, clock.timestamp); + // Finis and delete + var path = await recorder.FinishWriting(); + File.Delete(path); + } + #endregion + } +} \ No newline at end of file diff --git a/Runtime/VideoKitRecorder.cs.meta 
b/Runtime/VideoKitRecorder.cs.meta new file mode 100644 index 0000000..3c06e54 --- /dev/null +++ b/Runtime/VideoKitRecorder.cs.meta @@ -0,0 +1,11 @@ +fileFormatVersion: 2 +guid: e0fffa95ffc0b4c7a87ce2d4c7fd1873 +MonoImporter: + externalObjects: {} + serializedVersion: 2 + defaultReferences: [] + executionOrder: 0 + icon: {fileID: 2800000, guid: 7728882fe6960415e897bae270b67d4e, type: 3} + userData: + assetBundleName: + assetBundleVariant: diff --git a/UI.meta b/UI.meta new file mode 100644 index 0000000..9fb7f6d --- /dev/null +++ b/UI.meta @@ -0,0 +1,8 @@ +fileFormatVersion: 2 +guid: 99365d207025e47a9859a2470238576c +folderAsset: yes +DefaultImporter: + externalObjects: {} + userData: + assetBundleName: + assetBundleVariant: diff --git a/UI/Textures.meta b/UI/Textures.meta new file mode 100644 index 0000000..3f1b04e --- /dev/null +++ b/UI/Textures.meta @@ -0,0 +1,8 @@ +fileFormatVersion: 2 +guid: 148c5f4d8ff7048aaa44bb0fd30a76a9 +folderAsset: yes +DefaultImporter: + externalObjects: {} + userData: + assetBundleName: + assetBundleVariant: diff --git a/UI/Textures/button.png b/UI/Textures/button.png new file mode 100644 index 0000000..a6a00d3 Binary files /dev/null and b/UI/Textures/button.png differ diff --git a/UI/Textures/button.png.meta b/UI/Textures/button.png.meta new file mode 100644 index 0000000..c1a5384 --- /dev/null +++ b/UI/Textures/button.png.meta @@ -0,0 +1,139 @@ +fileFormatVersion: 2 +guid: 305f11ed3722e4a11a8a0327c3364af5 +TextureImporter: + internalIDToNameTable: [] + externalObjects: {} + serializedVersion: 10 + mipmaps: + mipMapMode: 0 + enableMipMap: 1 + sRGBTexture: 1 + linearTexture: 0 + fadeOut: 0 + borderMipMap: 0 + mipMapsPreserveCoverage: 0 + alphaTestReferenceValue: 0.5 + mipMapFadeDistanceStart: 1 + mipMapFadeDistanceEnd: 3 + bumpmap: + convertToNormalMap: 0 + externalNormalMap: 0 + heightScale: 0.25 + normalMapFilter: 0 + isReadable: 0 + streamingMipmaps: 0 + streamingMipmapsPriority: 0 + grayScaleToAlpha: 0 + generateCubemap: 6 + cubemapConvolution: 0 + seamlessCubemap: 0 + textureFormat: 1 + maxTextureSize: 2048 + textureSettings: + serializedVersion: 2 + filterMode: -1 + aniso: 16 + mipBias: -100 + wrapU: -1 + wrapV: -1 + wrapW: -1 + nPOTScale: 0 + lightmap: 0 + compressionQuality: 50 + spriteMode: 1 + spriteExtrude: 1 + spriteMeshType: 1 + alignment: 0 + spritePivot: {x: 0.5, y: 0.5} + spritePixelsToUnits: 100 + spriteBorder: {x: 0, y: 0, z: 0, w: 0} + spriteGenerateFallbackPhysicsShape: 1 + alphaUsage: 1 + alphaIsTransparency: 1 + spriteTessellationDetail: -1 + textureType: 8 + textureShape: 1 + singleChannelComponent: 0 + maxTextureSizeSet: 0 + compressionQualitySet: 0 + textureFormatSet: 0 + platformSettings: + - serializedVersion: 3 + buildTarget: DefaultTexturePlatform + maxTextureSize: 512 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 0 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 1 + - serializedVersion: 3 + buildTarget: Standalone + maxTextureSize: 512 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 0 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 1 + - serializedVersion: 3 + buildTarget: iPhone + maxTextureSize: 512 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 0 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + 
androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 1 + - serializedVersion: 3 + buildTarget: Android + maxTextureSize: 512 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 0 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 1 + - serializedVersion: 3 + buildTarget: WebGL + maxTextureSize: 512 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 0 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 1 + spriteSheet: + serializedVersion: 2 + sprites: [] + outline: [] + physicsShape: [] + bones: [] + spriteID: 5e97eb03825dee720800000000000000 + internalID: 0 + vertices: [] + indices: + edges: [] + weights: [] + secondaryTextures: [] + spritePackingTag: + pSDRemoveMatte: 0 + pSDShowRemoveMatteOption: 0 + userData: + assetBundleName: + assetBundleVariant: diff --git a/UI/Textures/circle.png b/UI/Textures/circle.png new file mode 100644 index 0000000..94c5b5a Binary files /dev/null and b/UI/Textures/circle.png differ diff --git a/UI/Textures/circle.png.meta b/UI/Textures/circle.png.meta new file mode 100644 index 0000000..44bbedc --- /dev/null +++ b/UI/Textures/circle.png.meta @@ -0,0 +1,139 @@ +fileFormatVersion: 2 +guid: 6ef7ae2208e6c44fba2978f2ebe42b25 +TextureImporter: + internalIDToNameTable: [] + externalObjects: {} + serializedVersion: 10 + mipmaps: + mipMapMode: 0 + enableMipMap: 1 + sRGBTexture: 1 + linearTexture: 0 + fadeOut: 0 + borderMipMap: 0 + mipMapsPreserveCoverage: 0 + alphaTestReferenceValue: 0.5 + mipMapFadeDistanceStart: 1 + mipMapFadeDistanceEnd: 3 + bumpmap: + convertToNormalMap: 0 + externalNormalMap: 0 + heightScale: 0.25 + normalMapFilter: 0 + isReadable: 0 + streamingMipmaps: 0 + streamingMipmapsPriority: 0 + grayScaleToAlpha: 0 + generateCubemap: 6 + cubemapConvolution: 0 + seamlessCubemap: 0 + textureFormat: 1 + maxTextureSize: 2048 + textureSettings: + serializedVersion: 2 + filterMode: -1 + aniso: 16 + mipBias: -100 + wrapU: -1 + wrapV: -1 + wrapW: -1 + nPOTScale: 0 + lightmap: 0 + compressionQuality: 50 + spriteMode: 1 + spriteExtrude: 1 + spriteMeshType: 1 + alignment: 0 + spritePivot: {x: 0.5, y: 0.5} + spritePixelsToUnits: 100 + spriteBorder: {x: 0, y: 0, z: 0, w: 0} + spriteGenerateFallbackPhysicsShape: 1 + alphaUsage: 1 + alphaIsTransparency: 1 + spriteTessellationDetail: -1 + textureType: 8 + textureShape: 1 + singleChannelComponent: 0 + maxTextureSizeSet: 0 + compressionQualitySet: 0 + textureFormatSet: 0 + platformSettings: + - serializedVersion: 3 + buildTarget: DefaultTexturePlatform + maxTextureSize: 512 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 0 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 1 + - serializedVersion: 3 + buildTarget: Standalone + maxTextureSize: 512 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 0 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 1 + - serializedVersion: 3 + buildTarget: iPhone + maxTextureSize: 512 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 0 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + 
androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 1 + - serializedVersion: 3 + buildTarget: Android + maxTextureSize: 512 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 0 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 1 + - serializedVersion: 3 + buildTarget: WebGL + maxTextureSize: 512 + resizeAlgorithm: 0 + textureFormat: -1 + textureCompression: 0 + compressionQuality: 50 + crunchedCompression: 0 + allowsAlphaSplitting: 0 + overridden: 0 + androidETC2FallbackOverride: 0 + forceMaximumCompressionQuality_BC6H_BC7: 1 + spriteSheet: + serializedVersion: 2 + sprites: [] + outline: [] + physicsShape: [] + bones: [] + spriteID: 5e97eb03825dee720800000000000000 + internalID: 0 + vertices: [] + indices: + edges: [] + weights: [] + secondaryTextures: [] + spritePackingTag: + pSDRemoveMatte: 0 + pSDShowRemoveMatteOption: 0 + userData: + assetBundleName: + assetBundleVariant: diff --git a/UI/VideoKitRecordButton.prefab b/UI/VideoKitRecordButton.prefab new file mode 100644 index 0000000..8475598 --- /dev/null +++ b/UI/VideoKitRecordButton.prefab @@ -0,0 +1,312 @@ +%YAML 1.1 +%TAG !u! tag:unity3d.com,2011: +--- !u!1 &1624349787801622 +GameObject: + m_ObjectHideFlags: 0 + m_CorrespondingSourceObject: {fileID: 0} + m_PrefabInstance: {fileID: 0} + m_PrefabAsset: {fileID: 0} + serializedVersion: 6 + m_Component: + - component: {fileID: 224048532206575024} + - component: {fileID: 222505747289298574} + - component: {fileID: 114734979408368796} + m_Layer: 5 + m_Name: Countdown + m_TagString: Untagged + m_Icon: {fileID: 0} + m_NavMeshLayer: 0 + m_StaticEditorFlags: 0 + m_IsActive: 1 +--- !u!224 &224048532206575024 +RectTransform: + m_ObjectHideFlags: 0 + m_CorrespondingSourceObject: {fileID: 0} + m_PrefabInstance: {fileID: 0} + m_PrefabAsset: {fileID: 0} + m_GameObject: {fileID: 1624349787801622} + m_LocalRotation: {x: -0, y: -0, z: -0, w: 1} + m_LocalPosition: {x: 0, y: 0, z: 0} + m_LocalScale: {x: 1, y: 1, z: 1} + m_ConstrainProportionsScale: 0 + m_Children: [] + m_Father: {fileID: 224991663417966582} + m_RootOrder: 0 + m_LocalEulerAnglesHint: {x: 0, y: 0, z: 0} + m_AnchorMin: {x: 0, y: 0} + m_AnchorMax: {x: 1, y: 1} + m_AnchoredPosition: {x: 0, y: 0} + m_SizeDelta: {x: 0, y: 0} + m_Pivot: {x: 0.5, y: 0.5} +--- !u!222 &222505747289298574 +CanvasRenderer: + m_ObjectHideFlags: 0 + m_CorrespondingSourceObject: {fileID: 0} + m_PrefabInstance: {fileID: 0} + m_PrefabAsset: {fileID: 0} + m_GameObject: {fileID: 1624349787801622} + m_CullTransparentMesh: 1 +--- !u!114 &114734979408368796 +MonoBehaviour: + m_ObjectHideFlags: 0 + m_CorrespondingSourceObject: {fileID: 0} + m_PrefabInstance: {fileID: 0} + m_PrefabAsset: {fileID: 0} + m_GameObject: {fileID: 1624349787801622} + m_Enabled: 1 + m_EditorHideFlags: 0 + m_Script: {fileID: 11500000, guid: fe87c0e1cc204ed48ad3b37840f39efc, type: 3} + m_Name: + m_EditorClassIdentifier: + m_Material: {fileID: 0} + m_Color: {r: 0.9411765, g: 0, b: 0, a: 1} + m_RaycastTarget: 1 + m_RaycastPadding: {x: 0, y: 0, z: 0, w: 0} + m_Maskable: 1 + m_OnCullStateChanged: + m_PersistentCalls: + m_Calls: [] + m_Sprite: {fileID: 21300000, guid: 305f11ed3722e4a11a8a0327c3364af5, type: 3} + m_Type: 3 + m_PreserveAspect: 1 + m_FillCenter: 1 + m_FillMethod: 4 + m_FillAmount: 0.28 + m_FillClockwise: 1 + m_FillOrigin: 2 + m_UseSpriteMesh: 0 + m_PixelsPerUnitMultiplier: 1 +--- !u!1 &1685906967458926 +GameObject: + m_ObjectHideFlags: 0 + 
m_CorrespondingSourceObject: {fileID: 0} + m_PrefabInstance: {fileID: 0} + m_PrefabAsset: {fileID: 0} + serializedVersion: 6 + m_Component: + - component: {fileID: 224333668068116212} + - component: {fileID: 222231895854482366} + - component: {fileID: 114009954537193390} + m_Layer: 5 + m_Name: Circle + m_TagString: Untagged + m_Icon: {fileID: 0} + m_NavMeshLayer: 0 + m_StaticEditorFlags: 0 + m_IsActive: 1 +--- !u!224 &224333668068116212 +RectTransform: + m_ObjectHideFlags: 0 + m_CorrespondingSourceObject: {fileID: 0} + m_PrefabInstance: {fileID: 0} + m_PrefabAsset: {fileID: 0} + m_GameObject: {fileID: 1685906967458926} + m_LocalRotation: {x: -0, y: -0, z: -0, w: 1} + m_LocalPosition: {x: 0, y: 0, z: 0} + m_LocalScale: {x: 1, y: 1, z: 1} + m_ConstrainProportionsScale: 0 + m_Children: [] + m_Father: {fileID: 224991663417966582} + m_RootOrder: 1 + m_LocalEulerAnglesHint: {x: 0, y: 0, z: 0} + m_AnchorMin: {x: 0, y: 0} + m_AnchorMax: {x: 1, y: 1} + m_AnchoredPosition: {x: 0, y: 0} + m_SizeDelta: {x: -39.63, y: -37.71} + m_Pivot: {x: 0.5, y: 0.5} +--- !u!222 &222231895854482366 +CanvasRenderer: + m_ObjectHideFlags: 0 + m_CorrespondingSourceObject: {fileID: 0} + m_PrefabInstance: {fileID: 0} + m_PrefabAsset: {fileID: 0} + m_GameObject: {fileID: 1685906967458926} + m_CullTransparentMesh: 1 +--- !u!114 &114009954537193390 +MonoBehaviour: + m_ObjectHideFlags: 0 + m_CorrespondingSourceObject: {fileID: 0} + m_PrefabInstance: {fileID: 0} + m_PrefabAsset: {fileID: 0} + m_GameObject: {fileID: 1685906967458926} + m_Enabled: 1 + m_EditorHideFlags: 0 + m_Script: {fileID: 11500000, guid: fe87c0e1cc204ed48ad3b37840f39efc, type: 3} + m_Name: + m_EditorClassIdentifier: + m_Material: {fileID: 0} + m_Color: {r: 1, g: 1, b: 1, a: 1} + m_RaycastTarget: 1 + m_RaycastPadding: {x: 0, y: 0, z: 0, w: 0} + m_Maskable: 1 + m_OnCullStateChanged: + m_PersistentCalls: + m_Calls: [] + m_Sprite: {fileID: 21300000, guid: 6ef7ae2208e6c44fba2978f2ebe42b25, type: 3} + m_Type: 0 + m_PreserveAspect: 1 + m_FillCenter: 1 + m_FillMethod: 4 + m_FillAmount: 1 + m_FillClockwise: 1 + m_FillOrigin: 0 + m_UseSpriteMesh: 0 + m_PixelsPerUnitMultiplier: 1 +--- !u!1 &1961369732989660 +GameObject: + m_ObjectHideFlags: 0 + m_CorrespondingSourceObject: {fileID: 0} + m_PrefabInstance: {fileID: 0} + m_PrefabAsset: {fileID: 0} + serializedVersion: 6 + m_Component: + - component: {fileID: 224991663417966582} + - component: {fileID: 222115632752838930} + - component: {fileID: 114130223117990098} + - component: {fileID: 114702512721743734} + - component: {fileID: 114727111200153766} + - component: {fileID: 114316963683169970} + m_Layer: 5 + m_Name: VideoKitRecordButton + m_TagString: Untagged + m_Icon: {fileID: 2800000, guid: d6df589e0376343f7a3b406329630a74, type: 3} + m_NavMeshLayer: 0 + m_StaticEditorFlags: 0 + m_IsActive: 1 +--- !u!224 &224991663417966582 +RectTransform: + m_ObjectHideFlags: 0 + m_CorrespondingSourceObject: {fileID: 0} + m_PrefabInstance: {fileID: 0} + m_PrefabAsset: {fileID: 0} + m_GameObject: {fileID: 1961369732989660} + m_LocalRotation: {x: -0, y: -0, z: -0, w: 1} + m_LocalPosition: {x: 0, y: 0, z: 0} + m_LocalScale: {x: 1, y: 1, z: 1} + m_ConstrainProportionsScale: 0 + m_Children: + - {fileID: 224048532206575024} + - {fileID: 224333668068116212} + m_Father: {fileID: 0} + m_RootOrder: 0 + m_LocalEulerAnglesHint: {x: 0, y: 0, z: 0} + m_AnchorMin: {x: 0.5, y: 0} + m_AnchorMax: {x: 0.5, y: 0} + m_AnchoredPosition: {x: 0, y: 45.469727} + m_SizeDelta: {x: 148.4, y: 147} + m_Pivot: {x: 0.5, y: 0} +--- !u!222 &222115632752838930 
+CanvasRenderer: + m_ObjectHideFlags: 0 + m_CorrespondingSourceObject: {fileID: 0} + m_PrefabInstance: {fileID: 0} + m_PrefabAsset: {fileID: 0} + m_GameObject: {fileID: 1961369732989660} + m_CullTransparentMesh: 1 +--- !u!114 &114130223117990098 +MonoBehaviour: + m_ObjectHideFlags: 0 + m_CorrespondingSourceObject: {fileID: 0} + m_PrefabInstance: {fileID: 0} + m_PrefabAsset: {fileID: 0} + m_GameObject: {fileID: 1961369732989660} + m_Enabled: 1 + m_EditorHideFlags: 0 + m_Script: {fileID: 11500000, guid: d0b148fe25e99eb48b9724523833bab1, type: 3} + m_Name: + m_EditorClassIdentifier: + m_Delegates: [] +--- !u!114 &114702512721743734 +MonoBehaviour: + m_ObjectHideFlags: 0 + m_CorrespondingSourceObject: {fileID: 0} + m_PrefabInstance: {fileID: 0} + m_PrefabAsset: {fileID: 0} + m_GameObject: {fileID: 1961369732989660} + m_Enabled: 1 + m_EditorHideFlags: 0 + m_Script: {fileID: 11500000, guid: fe87c0e1cc204ed48ad3b37840f39efc, type: 3} + m_Name: + m_EditorClassIdentifier: + m_Material: {fileID: 0} + m_Color: {r: 1, g: 1, b: 1, a: 0.097} + m_RaycastTarget: 1 + m_RaycastPadding: {x: 0, y: 0, z: 0, w: 0} + m_Maskable: 1 + m_OnCullStateChanged: + m_PersistentCalls: + m_Calls: [] + m_Sprite: {fileID: 21300000, guid: 305f11ed3722e4a11a8a0327c3364af5, type: 3} + m_Type: 3 + m_PreserveAspect: 1 + m_FillCenter: 1 + m_FillMethod: 4 + m_FillAmount: 0.72 + m_FillClockwise: 0 + m_FillOrigin: 2 + m_UseSpriteMesh: 0 + m_PixelsPerUnitMultiplier: 1 +--- !u!114 &114727111200153766 +MonoBehaviour: + m_ObjectHideFlags: 0 + m_CorrespondingSourceObject: {fileID: 0} + m_PrefabInstance: {fileID: 0} + m_PrefabAsset: {fileID: 0} + m_GameObject: {fileID: 1961369732989660} + m_Enabled: 1 + m_EditorHideFlags: 0 + m_Script: {fileID: 11500000, guid: 4e29b1a8efbd4b44bb3f3716e73f07ff, type: 3} + m_Name: + m_EditorClassIdentifier: + m_Navigation: + m_Mode: 3 + m_WrapAround: 0 + m_SelectOnUp: {fileID: 0} + m_SelectOnDown: {fileID: 0} + m_SelectOnLeft: {fileID: 0} + m_SelectOnRight: {fileID: 0} + m_Transition: 1 + m_Colors: + m_NormalColor: {r: 1, g: 1, b: 1, a: 1} + m_HighlightedColor: {r: 0.9607843, g: 0.9607843, b: 0.9607843, a: 1} + m_PressedColor: {r: 0.78431374, g: 0.78431374, b: 0.78431374, a: 1} + m_SelectedColor: {r: 0.9607843, g: 0.9607843, b: 0.9607843, a: 1} + m_DisabledColor: {r: 0.78431374, g: 0.78431374, b: 0.78431374, a: 0.5019608} + m_ColorMultiplier: 1 + m_FadeDuration: 0.1 + m_SpriteState: + m_HighlightedSprite: {fileID: 0} + m_PressedSprite: {fileID: 0} + m_SelectedSprite: {fileID: 0} + m_DisabledSprite: {fileID: 0} + m_AnimationTriggers: + m_NormalTrigger: Normal + m_HighlightedTrigger: Highlighted + m_PressedTrigger: Pressed + m_SelectedTrigger: Highlighted + m_DisabledTrigger: Disabled + m_Interactable: 1 + m_TargetGraphic: {fileID: 114009954537193390} + m_OnClick: + m_PersistentCalls: + m_Calls: [] +--- !u!114 &114316963683169970 +MonoBehaviour: + m_ObjectHideFlags: 0 + m_CorrespondingSourceObject: {fileID: 0} + m_PrefabInstance: {fileID: 0} + m_PrefabAsset: {fileID: 0} + m_GameObject: {fileID: 1961369732989660} + m_Enabled: 1 + m_EditorHideFlags: 0 + m_Script: {fileID: 11500000, guid: 0ac74b706104a475c98f83f30de96193, type: 3} + m_Name: + m_EditorClassIdentifier: + maxDuration: 10 + countdown: {fileID: 114734979408368796} + OnTouchDown: + m_PersistentCalls: + m_Calls: [] + OnTouchUp: + m_PersistentCalls: + m_Calls: [] diff --git a/UI/VideoKitRecordButton.prefab.meta b/UI/VideoKitRecordButton.prefab.meta new file mode 100644 index 0000000..5f2c3cd --- /dev/null +++ b/UI/VideoKitRecordButton.prefab.meta @@ -0,0 
+1,10 @@ +fileFormatVersion: 2 +guid: d88712a49642b4b49b8322acc70b918b +timeCreated: 1525456370 +licenseType: Pro +NativeFormatImporter: + externalObjects: {} + mainObjectFileID: 100100000 + userData: + assetBundleName: + assetBundleVariant: diff --git a/package.json b/package.json new file mode 100644 index 0000000..d2f46d8 --- /dev/null +++ b/package.json @@ -0,0 +1,30 @@ +{ + "name": "ai.natml.videokit", + "version": "0.0.17", + "displayName": "VideoKit", + "description": "The only user-generated content solution for Unity Engine.", + "unity": "2022.3", + "dependencies": { + "ai.natml.natml": "1.1.16", + "ai.fxn.fxn3d": "0.0.4" + }, + "keywords": [ + "natml", + "function", + "fxn", + "unity3d", + "recording", + "sharing", + "camera", + "microphone", + "ai", + "natshare" + ], + "author": { + "name": "NatML", + "email": "hi@natml.ai", + "url": "https://github.com/natmlx" + }, + "license": "Apache-2.0", + "repository": "github:natmlx/videokit" +} \ No newline at end of file diff --git a/package.json.meta b/package.json.meta new file mode 100644 index 0000000..f44008a --- /dev/null +++ b/package.json.meta @@ -0,0 +1,7 @@ +fileFormatVersion: 2 +guid: c7e983d53222c4c8384d9133d8af03ff +PackageManifestImporter: + externalObjects: {} + userData: + assetBundleName: + assetBundleVariant: