Persistent Swift Compiler Error: "Value of type 'ObjectiveCClass' has no member 'method'" in Flutter iOS App with Azure Speech SDK

Kimberly Coston 0 Reputation points
2025-06-17T15:22:06.2466667+00:00

I am developing a Flutter application and integrating the Azure Speech SDK for keyword recognition on iOS. I'm working with a mixed Swift/Objective-C codebase and am consistently encountering a Swift compiler error, despite extensive troubleshooting.

The Core Problem:

My Swift AppDelegate.swift class cannot access a method (setFlutterChannel) that is explicitly declared and implemented in my custom Objective-C class (KeywordRecognizer). In practice, the app recognizes the keyword/wake word, but the follow-up action never fires, presumably because the Objective-C code is not bridging correctly to the Swift side.

The Specific Error Message (consistently):


Swift Compiler Error (Xcode): Value of type 'KeywordRecognizer' has no member 'setFlutterChannel'

/Users/kimberlycoston/ShelfSenseSolo/ios/Runner/AppDelegate.swift:20:27

(Note: The exact line number might vary slightly based on file edits, but it always points to the call keywordRecognizer?.setFlutterChannel(keywordChannel)).

My Setup:

  • iOS Native Code: (files linked here: https://drive.google.com/drive/folders/1jiBiHnrDFqn6gg6WOpXaf0BneUZL9eNd?usp=sharing)
    • __KeywordRecognizer.h__ (Objective-C header): Declares setFlutterChannel:(FlutterMethodChannel *)channel; and @property (strong, nonatomic, nullable) FlutterMethodChannel *flutterChannel;. It imports Foundation.h, SPXSpeechApi.h, and Flutter.h.
    • __KeywordRecognizer.m__ (Objective-C implementation): Implements setFlutterChannel: and uses self.flutterChannel for callbacks (e.g., [strongSelf.flutterChannel invokeMethod:@"onWakeWordDetected" arguments:nil];).
    • __Runner-Bridging-Header.h__: Imports GeneratedPluginRegistrant.h, SPXSpeechApi.h, KeywordRecognizer.h, and Flutter.h.
    • __AppDelegate.swift__: Instantiates KeywordRecognizer and attempts to call setFlutterChannel on it. It uses @objc class AppDelegate: FlutterAppDelegate.
  • Azure Speech SDK Integration:
    • The MicrosoftCognitiveServicesSpeech.xcframework (the iOS embedded binary) is manually added to the Xcode project's "Frameworks, Libraries, and Embedded Content" section, with "Embed & Sign" selected.
    • The Podfile has been cleared of any MicrosoftCognitiveServicesSpeech-iOS or MicrosoftCognitiveServicesSpeechEmbedded-iOS entries to prevent conflicts. pod install has been run after Podfile changes.
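For reference, here is a minimal sketch of the interface and call site as described above (reconstructed from the description; the exact files are in the linked Drive folder, so names and details here are assumptions):

```objc
// KeywordRecognizer.h (reconstructed sketch, not the actual file)
#import <Foundation/Foundation.h>
#import <Flutter/Flutter.h>

@interface KeywordRecognizer : NSObject

// Channel used for callbacks such as
// [self.flutterChannel invokeMethod:@"onWakeWordDetected" arguments:nil];
@property (strong, nonatomic, nullable) FlutterMethodChannel *flutterChannel;

- (void)setFlutterChannel:(FlutterMethodChannel *)channel;

@end
```

And the Swift call site that fails to compile:

```swift
// AppDelegate.swift (sketch; keywordChannel is assumed to be a
// FlutterMethodChannel created earlier in didFinishLaunchingWithOptions)
keywordRecognizer?.setFlutterChannel(keywordChannel)
```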

Troubleshooting Steps Taken (without success, leading to this forum post):

1.  Confirmed method declaration/implementation: Verified setFlutterChannel is correctly in KeywordRecognizer.h and implemented in KeywordRecognizer.m.

2.  Bridging Header: Confirmed Runner-Bridging-Header.h explicitly imports KeywordRecognizer.h.

3.  Xcode Build Settings:

  • "Objective-C Bridging Header" is set correctly to Runner/Runner-Bridging-Header.h for all build configurations.
  • "Defines Module" is set to "Yes" for the Runner target.
  • "Product Module Name" is consistent across all build configurations (Runner).

4.  File Inclusion/Compilation:

  • KeywordRecognizer.m is explicitly listed and enabled in "Build Phases" -> "Compile Sources" for the Runner target.
  • AppDelegate.swift is also in "Compile Sources".

5.  Clean Builds: Performed numerous "Product > Clean Build Folder" operations and manually deleted Xcode's Derived Data (~/Library/Developer/Xcode/DerivedData).

6.  Framework Management: Ensured the Azure SDK .xcframework is the only method of integration for the Speech SDK (no lingering CocoaPods entries).

7.  File Re-adding: As a last resort, removed KeywordRecognizer.h and KeywordRecognizer.m from the Xcode project (removed reference), cleaned Derived Data, and then re-added them.

Despite all these extensive efforts, the Swift compiler consistently reports that KeywordRecognizer has no setFlutterChannel member. It's as if the Swift compiler cannot properly resolve the Objective-C class's full interface from the bridging header.
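One Objective-C/Swift interop detail worth double-checking here, assuming the header really declares both the flutterChannel property and an explicit setFlutterChannel: method: the Clang importer treats an Objective-C method whose selector matches a declared property's setter (setFlutterChannel:) as that property's setter, not as a separately callable method. In that case Swift genuinely has no member named setFlutterChannel, and the call site would need to be a property assignment:

```swift
// AppDelegate.swift (sketch; keywordChannel is an assumed
// FlutterMethodChannel created earlier in app launch)

// Fails to compile: Swift folds the Objective-C `setFlutterChannel:`
// selector into the imported `flutterChannel` property, so no method
// named `setFlutterChannel` is ever surfaced to Swift.
// keywordRecognizer?.setFlutterChannel(keywordChannel)

// Compiles: assign through the imported property instead.
keywordRecognizer?.flutterChannel = keywordChannel
```

This would explain why the error persists no matter how the bridging header or build settings are configured, since it is the importer's renaming behavior rather than a project misconfiguration.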

ANY guidance or suggestions would be greatly appreciated!! Thank you!


1 answer

  1. Amira Bedhiafi 33,071 Reputation points Volunteer Moderator
    2025-06-22T16:43:56.4966667+00:00

    Hello Kimberly !

    Thank you for posting on Microsoft Learn.

This usually means Swift sees your class only as a generic @objc type, or only sees the interface exposed through a module map, but not the specific method.

If KeywordRecognizer.h declares the class as @interface KeywordRecognizer : NSObject, you're good on that front.

However, if you're using a @class forward declaration for KeywordRecognizer somewhere instead of an #import, Swift won't know the full class layout.
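For example (a sketch, assuming the bridging header is where the import happens): a forward declaration gives Swift only the class name, while a full import exposes its members:

```objc
// Runner-Bridging-Header.h

// Not enough: Swift learns the class exists, but none of its
// methods or properties are visible.
// @class KeywordRecognizer;

// Required: Swift imports the full @interface, including
// flutterChannel and any custom methods.
#import "KeywordRecognizer.h"
```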

    Remove @import KeywordRecognizer or any use of a compiled module that doesn’t expose the full interface.

If your project accidentally compiled KeywordRecognizer into a static library or dynamic framework, Swift will see only what's in that module's map, which usually does not include custom methods like setFlutterChannel.

    Go to your Xcode project > Build Settings (for Runner target), and verify the following:

    Objective-C Bridging Header: Runner/Runner-Bridging-Header.h

    Defines Module: YES

    Always Embed Swift Standard Libraries: YES

    Also, check that KeywordRecognizer.h is explicitly listed in your bridging header.

