
iOS LLM Task SDK is NOT initialized successfully via KMP #5601

Open · 2BAB opened this issue Sep 2, 2024 · 2 comments
Labels: platform:ios (MediaPipe iOS issues) · stat:awaiting googler (Waiting for Google Engineer's Response) · task:LLM inference (MediaPipe LLM Inference Gen AI setup) · type:support (General questions)

Comments


2BAB commented Sep 2, 2024

Have I written custom code (as opposed to using a stock example script provided in MediaPipe)

Yes

OS Platform and Distribution

iOS 15

MediaPipe Tasks SDK version

0.10.14

Task name (e.g. Image classification, Gesture recognition etc.)

LLM Inference

Programming Language and version (e.g. C++, Python, Java)

Kotlin, Objective-C, Swift

Describe the actual behavior

It crashes with an NPE (NullPointerException) right after the SDK internal log "Metal LLM tokens initialized" is printed.

Describe the expected behaviour

It should initialize normally. As a mitigation, the SDK could add a new init method that catches errors internally and returns a nullable error string (see the wrapper sketch after the interop note below).

Standalone code/steps you may have used to try to get what you need

  1. Integrate the iOS SDK into a Kotlin Multiplatform project using the CocoaPods Gradle plugin:
kotlin {
    listOf(
        iosX64(),
        iosArm64(),
        iosSimulatorArm64()
    ).forEach {
        it.binaries.framework {
            baseName = "Mediapiper"
            isStatic = true
        }
        it.binaries.all {
            // Let the linker find the Swift runtime.
            linkerOpts("-L/usr/lib/swift")
            linkerOpts("-rpath", "/usr/lib/swift")
            // Pick the xcframework slice that matches the Kotlin/Native target.
            val aicPathSuffix = when (this.target.konanTarget) {
                KonanTarget.IOS_ARM64 -> "ios-arm64"
                KonanTarget.IOS_X64, KonanTarget.IOS_SIMULATOR_ARM64 -> "ios-arm64_x86_64-simulator"
                else -> null
            }
            aicPathSuffix?.let { p ->
                // Link against the MediaPipe pods fetched by the companion iosApp project.
                listOf(
                    "MediaPipeTasksGenAIC",
                    "MediaPipeTasksGenAI"
                ).forEach { f ->
                    linkerOpts("-framework", f, "-F../iosApp/Pods/$f/frameworks/$f.xcframework/$p")
                }
                val swiftPathSuffix = when (this.target.konanTarget) {
                    KonanTarget.IOS_ARM64 -> "iphoneos"
                    KonanTarget.IOS_X64, KonanTarget.IOS_SIMULATOR_ARM64 -> "iphonesimulator"
                    else -> null
                }
                swiftPathSuffix?.let { sp ->
                    // Also search the Swift libraries bundled with the Xcode toolchain.
                    val swiftPathPrefix =
                        "/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/lib"
                    linkerOpts("-L$swiftPathPrefix/swift/$sp")
                    linkerOpts("-rpath", "$swiftPathPrefix/swift-5.0/$sp")
                }
            }
        }
    }

    cocoapods {
        name = "Mediapiper"

        version = "1.0.1"
        ios.deploymentTarget = "15"

        summary = "Mediapiper"
        homepage = "https://github.com/2BAB/Mediapiper"

        pod("MediaPipeTasksGenAIC") {
            version = "0.10.14"
            extraOpts += listOf("-compiler-option", "-fmodules")
        }
        pod("MediaPipeTasksGenAI") {
            version = "0.10.14"
            extraOpts += listOf("-compiler-option", "-fmodules")
        }

    }
    
}
  2. Import the MPPLLMInference class generated by the Kotlin/Native CocoaPods interop in step 1:
import cocoapods.MediaPipeTasksGenAI.MPPLLMInference
import cocoapods.MediaPipeTasksGenAI.MPPLLMInferenceOptions
  3. Try to initialize it (the same way the official sample does):
    init {
        val modelPath = NSBundle.mainBundle.pathForResource("gemma-2b-it-gpu-int4", "bin")

        val options = MPPLLMInferenceOptions(modelPath!!)
        options.setModelPath(modelPath!!)
        options.setMaxTokens(1024)
        options.setTopk(40)
        options.setTemperature(0.8f)
        options.setRandomSeed(102)
        // The trailing NSError out-parameter is nullable by default
        inference = MPPLLMInference(options, null) // NPE (NullPointerException) is thrown here, right after the SDK internal log "Metal LLM tokens initialized" is printed.
        // Alternatively, pass the NSError out-parameter explicitly
        // (memScoped/alloc/ptr come from kotlinx.cinterop):
        memScoped {
            val error = alloc<ObjCObjectVar<NSError?>>()
            val inference = MPPLLMInference(options, error.ptr) // The same NPE is thrown.
        }
    }
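
For comparison, the same initialization in plain Swift can handle the thrown error with do/catch. A minimal sketch, assuming the 0.10.14 Swift API surface implied by the setters above (makeLlmInference is just an illustrative helper name):

import MediaPipeTasksGenAI

// Plain-Swift equivalent of the snippet above: the throwing initializer
// can be caught here, unlike in the KMP-generated binding.
func makeLlmInference(modelPath: String) -> LlmInference? {
    let options = LlmInference.Options(modelPath: modelPath)
    options.maxTokens = 1024
    options.topk = 40
    options.temperature = 0.8
    options.randomSeed = 102
    do {
        return try LlmInference(options: options)
    } catch {
        print("LlmInference init failed: \(error)")
        return nil
    }
}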

This may come from a limitation of the KMP Objective-C interop (https://kotlinlang.org/docs/native-objc-interop.html#errors-and-exceptions), since the constructor is declared as a throwing initializer:

@objc public init(options: Options) throws { ... }
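
A possible workaround along the lines suggested above is a small hand-written Swift shim compiled into the iOS app that wraps the throwing initializer and exposes the error as a nullable string. A minimal sketch, where LlmInferenceWrapper is a hypothetical name and not part of the MediaPipe API:

import MediaPipeTasksGenAI

// Hypothetical non-throwing @objc wrapper: catches the Swift error internally
// and exposes it as a nullable string that Kotlin can inspect safely.
@objc public class LlmInferenceWrapper: NSObject {
    @objc public private(set) var inference: LlmInference?
    @objc public private(set) var errorMessage: String?

    @objc public init(options: LlmInference.Options) {
        do {
            inference = try LlmInference(options: options)
        } catch {
            errorMessage = error.localizedDescription
        }
        super.init()
    }
}

The Kotlin side could then construct the wrapper and branch on errorMessage instead of calling the throwing constructor directly.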

A full example for this issue can be found below:

https://github.com/2BAB/MediaPiper/blob/ios-sdk-init-crash/app/src/iosMain/kotlin/me.xx2bab.mediapiper/llm/LLMOperator.ios.kt#L50

Other info / Complete Logs

No response

@kuaashish added the platform:ios, task:LLM inference, and type:support labels on Sep 2, 2024
kuaashish (Collaborator) commented

Hi @2BAB,

Could you please provide the complete standalone code or the specific steps you are following from our documentation to help us better understand the issue? If needed, we will reproduce it on our end.

Thank you!!

@kuaashish added the stat:awaiting response label on Sep 2, 2024
@google-ml-butler bot removed the stat:awaiting response label on Sep 2, 2024
@kuaashish added the stat:awaiting response label on Sep 3, 2024
2BAB (Author) commented Sep 4, 2024

@kuaashish updated the standalone code snippet and bolded the full example link.

@google-ml-butler bot removed the stat:awaiting response label on Sep 4, 2024
@kuaashish added the stat:awaiting googler label on Sep 24, 2024