I am developing a Flutter application and integrating the Azure Speech SDK for keyword recognition on iOS. I'm working with a mixed Swift/Objective-C codebase and am consistently encountering a Swift compiler error, despite extensive troubleshooting.
The Core Problem:
My Swift `AppDelegate.swift` class cannot access a method (`setFlutterChannel`) that is explicitly declared and implemented in my custom Objective-C class (`KeywordRecognizer`). In practice, the app recognizes the keyword/wake word but never triggers the next action, presumably because the call is not bridging correctly into the Swift code.
The Specific Error Message (consistent):
`Swift Compiler Error (Xcode): Value of type 'KeywordRecognizer' has no member 'setFlutterChannel'` at `/Users/kimberlycoston/ShelfSenseSolo/ios/Runner/AppDelegate.swift:20:27`
(Note: the exact line number may vary slightly with file edits, but it always points to the call `keywordRecognizer?.setFlutterChannel(keywordChannel)`.)
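For reference, the failing call site looks roughly like this. This is a reconstructed sketch, not my exact file: the channel name and the shape of `didFinishLaunchingWithOptions` are assumptions based on the standard Flutter template, and the property/variable names come from the error message above (the actual files are in the linked Drive folder):

```swift
// AppDelegate.swift — simplified sketch of the failing call site
import Flutter
import UIKit

@objc class AppDelegate: FlutterAppDelegate {
    var keywordRecognizer: KeywordRecognizer?

    override func application(
        _ application: UIApplication,
        didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?
    ) -> Bool {
        let controller = window?.rootViewController as! FlutterViewController
        let keywordChannel = FlutterMethodChannel(
            name: "shelfsense/keyword",  // channel name is an assumption
            binaryMessenger: controller.binaryMessenger)

        keywordRecognizer = KeywordRecognizer()
        // Around line 20 — this is the call the compiler rejects:
        keywordRecognizer?.setFlutterChannel(keywordChannel)

        GeneratedPluginRegistrant.register(with: self)
        return super.application(application, didFinishLaunchingWithOptions: launchOptions)
    }
}
```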
My Setup:
- iOS Native Code (files linked here: https://drive.google.com/drive/folders/1jiBiHnrDFqn6gg6WOpXaf0BneUZL9eNd?usp=sharing)
- __KeywordRecognizer.h__ (Objective-C header): Declares `setFlutterChannel:(FlutterMethodChannel *)channel;` and `@property (strong, nonatomic, nullable) FlutterMethodChannel *flutterChannel;`. It imports `Foundation.h`, `SPXSpeechApi.h`, and `Flutter.h`.
- __KeywordRecognizer.m__ (Objective-C implementation): Implements `setFlutterChannel:` and uses `self.flutterChannel` for callbacks (e.g., `[strongSelf.flutterChannel invokeMethod:@"onWakeWordDetected" arguments:nil];`).
- __Runner-Bridging-Header.h__: Imports `GeneratedPluginRegistrant.h`, `SPXSpeechApi.h`, `KeywordRecognizer.h`, and `Flutter.h`.
- __AppDelegate.swift__: Instantiates `KeywordRecognizer` and attempts to call `setFlutterChannel` on it. The class is declared as `@objc class AppDelegate: FlutterAppDelegate`.
- Azure Speech SDK Integration:
  - The `MicrosoftCognitiveServicesSpeech.xcframework` (the iOS embedded binary) is added manually to the Xcode project's "Frameworks, Libraries, and Embedded Content" section, with "Embed & Sign" selected.
  - The `Podfile` has been cleared of any `MicrosoftCognitiveServicesSpeech-iOS` or `MicrosoftCognitiveServicesSpeechEmbedded-iOS` entries to prevent conflicts, and `pod install` has been re-run after the `Podfile` changes.
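Based on the setup described above, the header presumably looks something like this. This is a reconstructed sketch, not my exact file: the framework-style import paths, the `NSObject` superclass, and the nullability macros are assumptions (the actual files are in the linked Drive folder):

```objc
// KeywordRecognizer.h — reconstructed from the description above
#import <Foundation/Foundation.h>
#import <Flutter/Flutter.h>
#import <MicrosoftCognitiveServicesSpeech/SPXSpeechApi.h>

NS_ASSUME_NONNULL_BEGIN

@interface KeywordRecognizer : NSObject

// Channel used for callbacks into Dart, e.g. "onWakeWordDetected"
@property (strong, nonatomic, nullable) FlutterMethodChannel *flutterChannel;

// The method Swift fails to find; declared inside the @interface block
- (void)setFlutterChannel:(FlutterMethodChannel *)channel;

@end

NS_ASSUME_NONNULL_END
```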
Troubleshooting Steps Taken (without success, leading to this forum post):
1. Confirmed method declaration/implementation: Verified `setFlutterChannel` is correctly declared in `KeywordRecognizer.h` and implemented in `KeywordRecognizer.m`.
2. Bridging header: Confirmed `Runner-Bridging-Header.h` explicitly imports `KeywordRecognizer.h`.
3. Xcode build settings:
   - "Objective-C Bridging Header" is set correctly to `Runner/Runner-Bridging-Header.h` for all build configurations.
   - "Defines Module" is set to "Yes" for the `Runner` target.
   - "Product Module Name" is consistent across all build configurations (`Runner`).
4. File inclusion/compilation:
   - `KeywordRecognizer.m` is explicitly listed and enabled in "Build Phases" -> "Compile Sources" for the `Runner` target.
   - `AppDelegate.swift` is also in "Compile Sources".
5. Clean builds: Performed numerous "Product > Clean Build Folder" operations and manually deleted Xcode's Derived Data (`~/Library/Developer/Xcode/DerivedData`).
6. Framework management: Ensured the Azure SDK `.xcframework` is the only integration method for the Speech SDK (no lingering CocoaPods entries).
7. File re-adding: As a last resort, removed `KeywordRecognizer.h` and `KeywordRecognizer.m` from the Xcode project (removed references), cleaned Derived Data, and re-added them.
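For completeness, here is a sketch of the bridging header as described in step 2 and in My Setup above. The quote vs. angle-bracket import style and the framework path for the Azure header are assumptions; the import list itself matches the setup description:

```objc
// Runner-Bridging-Header.h — contents as described above
#import "GeneratedPluginRegistrant.h"
#import <MicrosoftCognitiveServicesSpeech/SPXSpeechApi.h>
#import "KeywordRecognizer.h"
#import <Flutter/Flutter.h>
```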
Despite all these efforts, the Swift compiler consistently reports that `KeywordRecognizer` has no `setFlutterChannel` member. It is as if the Swift compiler cannot resolve the Objective-C class's full interface from the bridging header.
ANY guidance or suggestions would be greatly appreciated!! Thank you!