[AVFoundation] Fix delegate signature for AVAudioSourceNode callbacks. #20357

Merged: 4 commits, Apr 9, 2024
165 changes: 165 additions & 0 deletions src/AVFoundation/AVAudioSourceNode.cs
@@ -0,0 +1,165 @@
using System;
using System.ComponentModel;
using System.Runtime.CompilerServices;

using AudioToolbox;
using ObjCRuntime;

namespace AVFoundation {

// This is messy...
// * The good delegate is:
// - AVAudioSourceNodeRenderHandler in XAMCORE_5_0
// - AVAudioSourceNodeRenderHandler3 otherwise
// * There are two legacy delegates:
// - AVAudioSourceNodeRenderHandler in .NET and legacy Xamarin
// - AVAudioSourceNodeRenderHandler2 in legacy Xamarin.
//
// History of mistakes:
//
// 1. We first bound this using AVAudioSourceNodeRenderHandler (legacy Xamarin version), which was wrong.
// 2. We then found the mistake, and bound it as AVAudioSourceNodeRenderHandler2 in legacy Xamarin, removed the initial version in .NET, and named it AVAudioSourceNodeRenderHandler in .NET.
// * However, we failed to write tests, so this delegate was broken too.
// 3. Finally we got a customer report, and realized the new delegate was broken too. So now there are two broken delegates, and one working (AVAudioSourceNodeRenderHandler3 in .NET and legacy Xamarin, named as AVAudioSourceNodeRenderHandler in XAMCORE_5_0).
// * And tests were added too.
//
// Note: "broken" here means "made to work with a workaround", which makes this even messier.
//

/// <summary>The delegate that will be called in a callback from <see cref="T:AVFoundation.AVAudioSourceNode" />.</summary>
/// <returns>An OSStatus result code. Return 0 to indicate success.</returns>
/// <param name="isSilence">Indicates whether the supplied audio data only contains silence.</param>
/// <param name="timestamp">The HAL time at which the audio will render.</param>
/// <param name="frameCount">The number of frames of audio to supply.</param>
/// <param name="outputData">The <see cref="T:AudioToolbox.AudioBuffers" /> that contains the supplied audio data when the callback returns.</param>
#if XAMCORE_5_0
public delegate /* OSStatus */ int AVAudioSourceNodeRenderHandler (ref bool isSilence, ref AudioTimeStamp timestamp, uint frameCount, AudioBuffers outputData);
#else
public delegate /* OSStatus */ int AVAudioSourceNodeRenderHandler3 (ref bool isSilence, ref AudioTimeStamp timestamp, uint frameCount, AudioBuffers outputData);
#endif

#if !XAMCORE_5_0
#if NET
[EditorBrowsable (EditorBrowsableState.Never)]
public delegate /* OSStatus */ int AVAudioSourceNodeRenderHandler (ref bool isSilence, ref AudioTimeStamp timestamp, uint frameCount, ref AudioBuffers outputData);
#else
[EditorBrowsable (EditorBrowsableState.Never)]
public delegate /* OSStatus */ int AVAudioSourceNodeRenderHandler (bool isSilence, AudioToolbox.AudioTimeStamp timestamp, uint frameCount, ref AudioBuffers outputData);
[EditorBrowsable (EditorBrowsableState.Never)]
public delegate /* OSStatus */ int AVAudioSourceNodeRenderHandler2 (ref bool isSilence, ref AudioTimeStamp timestamp, uint frameCount, ref AudioBuffers outputData);
#endif // NET
#endif // XAMCORE_5_0

public partial class AVAudioSourceNode {
#if !XAMCORE_5_0 && NET
[EditorBrowsable (EditorBrowsableState.Never)]
[Obsolete ("Use the overload that takes a delegate that does not take a 'ref AudioBuffers' instead. Assigning a value to the 'inputData' parameter in the callback has no effect.")]
public AVAudioSourceNode (AVAudioSourceNodeRenderHandler renderHandler)
: this (GetHandler (renderHandler))
{
}

[EditorBrowsable (EditorBrowsableState.Never)]
[Obsolete ("Use the overload that takes a delegate that does not take a 'ref AudioBuffers' instead. Assigning a value to the 'inputData' parameter in the callback has no effect.")]
public AVAudioSourceNode (AVAudioFormat format, AVAudioSourceNodeRenderHandler renderHandler)
: this (format, GetHandler (renderHandler))
{
}
#endif // !XAMCORE_5_0 && NET

#if !NET
[EditorBrowsable (EditorBrowsableState.Never)]
[Obsolete ("Use the overload that takes a delegate that does not take a 'ref AudioBuffers' instead. Assigning a value to the 'inputData' parameter in the callback has no effect.")]
public AVAudioSourceNode (AVAudioSourceNodeRenderHandler2 renderHandler)
: this (GetHandler (renderHandler))
{
}

[EditorBrowsable (EditorBrowsableState.Never)]
[Obsolete ("Use the overload that takes a delegate that does not take a 'ref AudioBuffers' instead. Assigning a value to the 'inputData' parameter in the callback has no effect.")]
public AVAudioSourceNode (AVAudioFormat format, AVAudioSourceNodeRenderHandler2 renderHandler)
: this (format, GetHandler (renderHandler))
{
}
#endif // !NET

/// <summary>Creates an <see cref="T:AVFoundation.AVAudioSourceNode" /> with the specified callback to render audio.</summary>
/// <param name="renderHandler">The callback that will be called to supply audio data.</param>
#if XAMCORE_5_0
public AVAudioSourceNode (AVAudioSourceNodeRenderHandler renderHandler)
#else
public AVAudioSourceNode (AVAudioSourceNodeRenderHandler3 renderHandler)
#endif // XAMCORE_5_0
: this (GetHandler (renderHandler))
{
}

/// <summary>Creates an <see cref="T:AVFoundation.AVAudioSourceNode" /> with the specified callback to render audio.</summary>
/// <param name="format">The format of the PCM audio data the callback supplies.</param>
/// <param name="renderHandler">The callback that will be called to supply audio data.</param>
#if XAMCORE_5_0
public AVAudioSourceNode (AVAudioFormat format, AVAudioSourceNodeRenderHandler renderHandler)
#else
public AVAudioSourceNode (AVAudioFormat format, AVAudioSourceNodeRenderHandler3 renderHandler)
#endif // XAMCORE_5_0
: this (format, GetHandler (renderHandler))
{
}

#if !NET
static AVAudioSourceNodeRenderHandlerRaw GetHandler (AVAudioSourceNodeRenderHandler renderHandler)
{
AVAudioSourceNodeRenderHandlerRaw rv = (IntPtr isSilence, IntPtr timestamp, uint frameCount, IntPtr outputData) => {
unsafe {
byte* isSilencePtr = (byte*) isSilence;
bool isSilenceBool = (*isSilencePtr) != 0;
AudioTimeStamp timestampValue = *(AudioTimeStamp*) timestamp;
var buffers = new AudioBuffers (outputData);
return renderHandler (isSilenceBool, timestampValue, frameCount, ref buffers);
}
};
return rv;
}
#endif // !NET

#if !XAMCORE_5_0
#if NET
static AVAudioSourceNodeRenderHandlerRaw GetHandler (AVAudioSourceNodeRenderHandler renderHandler)
#else
static AVAudioSourceNodeRenderHandlerRaw GetHandler (AVAudioSourceNodeRenderHandler2 renderHandler)
#endif // NET
{
AVAudioSourceNodeRenderHandlerRaw rv = (IntPtr isSilence, IntPtr timestamp, uint frameCount, IntPtr outputData) => {
unsafe {
byte* isSilencePtr = (byte*) isSilence;
bool isSilenceBool = (*isSilencePtr) != 0;
var buffers = new AudioBuffers (outputData);
var rv = renderHandler (ref isSilenceBool, ref Unsafe.AsRef<AudioTimeStamp> ((void*) timestamp), frameCount, ref buffers);
*isSilencePtr = isSilenceBool.AsByte ();
return rv;
}
};
return rv;
}
#endif // !XAMCORE_5_0

#if XAMCORE_5_0
static AVAudioSourceNodeRenderHandlerRaw GetHandler (AVAudioSourceNodeRenderHandler renderHandler)
#else
static AVAudioSourceNodeRenderHandlerRaw GetHandler (AVAudioSourceNodeRenderHandler3 renderHandler)
#endif // XAMCORE_5_0
{
AVAudioSourceNodeRenderHandlerRaw rv = (IntPtr isSilence, IntPtr timestamp, uint frameCount, IntPtr outputData) => {
unsafe {
byte* isSilencePtr = (byte*) isSilence;
bool isSilenceBool = (*isSilencePtr) != 0;
var buffers = new AudioBuffers (outputData);
var rv = renderHandler (ref isSilenceBool, ref Unsafe.AsRef<AudioTimeStamp> ((void*) timestamp), frameCount, buffers);
*isSilencePtr = isSilenceBool.AsByte ();
return rv;
}
};
return rv;
}
}
}
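For context, a minimal usage sketch (not part of the PR) showing how a consumer would construct a source node with the working delegate. This assumes the .NET surface, where the working delegate is named `AVAudioSourceNodeRenderHandler3`; the sine-wave generator, the format values, and the `SineSource` class name are all illustrative, compilation requires `/unsafe`, and the explicit delegate instantiation is used to disambiguate from the obsolete overloads:

```csharp
using System;
using AudioToolbox;
using AVFoundation;

// Sketch: a source node whose render callback fills the output buffers
// with a 440 Hz sine wave. Requires building with unsafe code enabled.
class SineSource {
	double phase;
	const double SampleRate = 44100;
	const double Frequency = 440;

	public AVAudioSourceNode CreateNode ()
	{
		return new AVAudioSourceNode (new AVAudioSourceNodeRenderHandler3 (
			(ref bool isSilence, ref AudioTimeStamp timestamp, uint frameCount, AudioBuffers outputData) => {
				var increment = 2 * Math.PI * Frequency / SampleRate;
				unsafe {
					// Write the same sine wave into every (non-interleaved) channel buffer.
					for (int buffer = 0; buffer < outputData.Count; buffer++) {
						float* samples = (float*) outputData [buffer].Data;
						double p = phase;
						for (uint frame = 0; frame < frameCount; frame++) {
							samples [frame] = (float) Math.Sin (p);
							p += increment;
						}
					}
				}
				phase += increment * frameCount;
				isSilence = false;
				return 0; // noErr
			}));
	}
}
```

Note that `isSilence` is a `ref` parameter precisely so the callback can report silence back to Core Audio, which is what the earlier by-value delegate signatures got wrong.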
2 changes: 0 additions & 2 deletions src/AVFoundation/AVCompat.cs
@@ -16,8 +16,6 @@
#nullable enable

namespace AVFoundation {
public delegate int AVAudioSourceNodeRenderHandler (bool isSilence, AudioToolbox.AudioTimeStamp timestamp, uint frameCount, ref AudioToolbox.AudioBuffers outputData);

partial class AVAudioNode {
internal AVAudioNode () { }
}
32 changes: 16 additions & 16 deletions src/avfoundation.cs
@@ -15844,32 +15844,32 @@ interface AVCompositionTrackFormatDescriptionReplacement : NSSecureCoding {
[Export ("replacementFormatDescription")]
CMFormatDescription ReplacementFormatDescription { get; }
}
#if NET
delegate /* OSStatus */ int AVAudioSourceNodeRenderHandler (ref bool isSilence, ref AudioTimeStamp timestamp, uint frameCount, ref AudioBuffers outputData);
#else
delegate /* OSStatus */ int AVAudioSourceNodeRenderHandler (bool isSilence, AudioToolbox.AudioTimeStamp timestamp, uint frameCount, ref AudioBuffers outputData);
delegate /* OSStatus */ int AVAudioSourceNodeRenderHandler2 (ref bool isSilence, ref AudioTimeStamp timestamp, uint frameCount, ref AudioBuffers outputData);
#endif

/// <summary>The delegate that will be called in a callback from <see cref="T:AVFoundation.AVAudioSourceNode" />.</summary>
/// <returns>An OSStatus result code. Return 0 to indicate success.</returns>
/// <param name="isSilence">Indicates whether the supplied audio data only contains silence. This is a pointer to a <see cref="T:System.Byte" /> value.</param>
/// <param name="timestamp">The HAL time at which the audio will render. This is a pointer to an <see cref="T:AudioToolbox.AudioTimeStamp" /> value.</param>
/// <param name="frameCount">The number of frames of audio to supply.</param>
/// <param name="outputData">The <see cref="T:AudioToolbox.AudioBuffers" /> that contains the supplied audio data when the callback returns. This is a handle for an <see cref="T:AudioToolbox.AudioBuffers" /> value.</param>
delegate /* OSStatus */ int AVAudioSourceNodeRenderHandlerRaw (IntPtr isSilence, IntPtr timestamp, uint frameCount, IntPtr outputData);

[Watch (6, 0), TV (13, 0), iOS (13, 0)]
[MacCatalyst (13, 1)]
[BaseType (typeof (AVAudioNode))]
[DisableDefaultCtor]
interface AVAudioSourceNode : AVAudioMixing {
/// <summary>Creates an <see cref="T:AVFoundation.AVAudioSourceNode" /> with the specified callback to render audio.</summary>
/// <param name="renderHandler">The callback that will be called to supply audio data.</param>
[Export ("initWithRenderBlock:")]
[DesignatedInitializer]
#if NET
NativeHandle Constructor (AVAudioSourceNodeRenderHandler renderHandler);
#else
NativeHandle Constructor (AVAudioSourceNodeRenderHandler2 renderHandler);
#endif
NativeHandle Constructor (AVAudioSourceNodeRenderHandlerRaw renderHandler);

/// <summary>Creates an <see cref="T:AVFoundation.AVAudioSourceNode" /> with the specified callback to render audio.</summary>
/// <param name="format">The format of the PCM audio data the callback supplies.</param>
/// <param name="renderHandler">The callback that will be called to supply audio data.</param>
[Export ("initWithFormat:renderBlock:")]
[DesignatedInitializer]
#if NET
NativeHandle Constructor (AVAudioFormat format, AVAudioSourceNodeRenderHandler renderHandler);
#else
NativeHandle Constructor (AVAudioFormat format, AVAudioSourceNodeRenderHandler2 renderHandler);
#endif
NativeHandle Constructor (AVAudioFormat format, AVAudioSourceNodeRenderHandlerRaw renderHandler);
}

delegate int AVAudioSinkNodeReceiverHandlerRaw (IntPtr timestamp, uint frameCount, IntPtr inputData);
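For illustration only: `AVAudioSourceNodeRenderHandlerRaw` has no access modifier in the binding definition above, so it is internal to the bindings and user code never implements it directly. The sketch below is a hypothetical handler body showing roughly how the raw pointer arguments are interpreted, mirroring the managed `GetHandler` trampolines in `AVAudioSourceNode.cs`:

```csharp
// Illustration: the first two arguments are native pointers, and any
// change to isSilence must be written back through its pointer.
AVAudioSourceNodeRenderHandlerRaw raw = (IntPtr isSilence, IntPtr timestamp, uint frameCount, IntPtr outputData) => {
	unsafe {
		bool silent = *(byte*) isSilence != 0;
		AudioTimeStamp ts = *(AudioTimeStamp*) timestamp;
		var buffers = new AudioBuffers (outputData);
		// ... write frameCount frames of audio into 'buffers' ...
		*(byte*) isSilence = 0; // report that audible audio was produced
		return 0; // noErr
	}
};
```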
2 changes: 2 additions & 0 deletions src/bgen/Generator.cs
@@ -4538,6 +4538,8 @@ group fullname by ns into g
if (shortName.StartsWith ("Func<", StringComparison.Ordinal))
continue;

WriteDocumentation (mi.DeclaringType);

var del = mi.DeclaringType;

if (AttributeManager.HasAttribute (mi.DeclaringType, "MonoNativeFunctionWrapper"))
1 change: 1 addition & 0 deletions src/frameworks.sources
@@ -267,6 +267,7 @@ AVFOUNDATION_SOURCES = \
AVFoundation/AVAudioSessionDataSourceDescription.cs \
AVFoundation/AVAudioSessionPortDescription.cs \
AVFoundation/AVAudioSinkNode.cs \
AVFoundation/AVAudioSourceNode.cs \
AVFoundation/AVCaptureConnection.cs \
AVFoundation/AVCaptureDeviceDiscoverySession.cs \
AVFoundation/AVCaptureDeviceInput.cs \
8 changes: 8 additions & 0 deletions tests/cecil-tests/Documentation.KnownFailures.txt
@@ -40122,6 +40122,14 @@ M:AVFoundation.AVAudioSourceNodeRenderHandler.#ctor(System.Object,System.IntPtr)
M:AVFoundation.AVAudioSourceNodeRenderHandler.BeginInvoke(System.Boolean@,AudioToolbox.AudioTimeStamp@,System.UInt32,AudioToolbox.AudioBuffers@,System.AsyncCallback,System.Object)
M:AVFoundation.AVAudioSourceNodeRenderHandler.EndInvoke(System.Boolean@,AudioToolbox.AudioTimeStamp@,AudioToolbox.AudioBuffers@,System.IAsyncResult)
M:AVFoundation.AVAudioSourceNodeRenderHandler.Invoke(System.Boolean@,AudioToolbox.AudioTimeStamp@,System.UInt32,AudioToolbox.AudioBuffers@)
M:AVFoundation.AVAudioSourceNodeRenderHandler3.#ctor(System.Object,System.IntPtr)
M:AVFoundation.AVAudioSourceNodeRenderHandler3.BeginInvoke(System.Boolean@,AudioToolbox.AudioTimeStamp@,System.UInt32,AudioToolbox.AudioBuffers,System.AsyncCallback,System.Object)
M:AVFoundation.AVAudioSourceNodeRenderHandler3.EndInvoke(System.Boolean@,AudioToolbox.AudioTimeStamp@,System.IAsyncResult)
M:AVFoundation.AVAudioSourceNodeRenderHandler3.Invoke(System.Boolean@,AudioToolbox.AudioTimeStamp@,System.UInt32,AudioToolbox.AudioBuffers)
M:AVFoundation.AVAudioSourceNodeRenderHandlerRaw.#ctor(System.Object,System.IntPtr)
M:AVFoundation.AVAudioSourceNodeRenderHandlerRaw.BeginInvoke(System.IntPtr,System.IntPtr,System.UInt32,System.IntPtr,System.AsyncCallback,System.Object)
M:AVFoundation.AVAudioSourceNodeRenderHandlerRaw.EndInvoke(System.IAsyncResult)
M:AVFoundation.AVAudioSourceNodeRenderHandlerRaw.Invoke(System.IntPtr,System.IntPtr,System.UInt32,System.IntPtr)
M:AVFoundation.AVAudioStereoMixing.#ctor
M:AVFoundation.AVAudioStereoMixing.#ctor(Foundation.NSObjectFlag)
M:AVFoundation.AVAudioStereoMixing.#ctor(ObjCRuntime.NativeHandle)