James GUO

Reputation: 1

Train the MediaPipe hand landmark model and use it in an iPhone app

I want to add a new gesture to the hand landmark model of MediaPipe and use it in my iPhone app. Here is what I am doing:

  1. Train the hand landmark model on my own data and create a Keras model.
  2. Convert the Keras model to a float16 TFLite model, add the metadata copied from the TFLite model contained in https://github.com/google-ai-edge/mediapipe-samples/blob/main/examples/hand_landmarker/ios/HandLandmarker/hand_landmarker.task, then rename the float16 TFLite model to hand_landmarks_detector.tflite (see the first sketch after this list).
  3. Package hand_landmarks_detector.tflite together with hand_detector.tflite (which is contained in https://github.com/google-ai-edge/mediapipe-samples/blob/main/examples/hand_landmarker/ios/HandLandmarker/hand_landmarker.task) into a new task file (see the second sketch after this list).
  4. Replace the old hand_landmark.task with the new one and run the iPhone app.
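
Step 2 roughly looks like the sketch below. The file names are placeholders, and it assumes standard float16 post-training quantization plus a tflite_support version that provides MetadataPopulator.load_metadata_and_associated_files:

    import tensorflow as tf
    from tflite_support import metadata as _metadata

    # 1) Convert the retrained Keras model to a float16 TFLite model.
    model = tf.keras.models.load_model("my_hand_landmark_model.keras")
    converter = tf.lite.TFLiteConverter.from_keras_model(model)
    converter.optimizations = [tf.lite.Optimize.DEFAULT]
    converter.target_spec.supported_types = [tf.float16]
    with open("hand_landmarks_detector.tflite", "wb") as f:
        f.write(converter.convert())

    # 2) Copy the metadata (and any associated files) from the stock model
    #    extracted from the sample hand_landmarker.task bundle.
    with open("original_hand_landmarks_detector.tflite", "rb") as f:
        src_model_buf = f.read()

    populator = _metadata.MetadataPopulator.with_model_file("hand_landmarks_detector.tflite")
    populator.load_metadata_and_associated_files(src_model_buf)
    populator.populate()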
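
Step 3 is roughly this, assuming the .task bundle is simply a ZIP archive holding the two TFLite models (I store the entries uncompressed to mirror the original bundle layout; that is an assumption on my side, not a documented format):

    import zipfile

    # Repackage the two TFLite models into a new hand_landmarker .task bundle.
    with zipfile.ZipFile("hand_landmarker.task", "w", zipfile.ZIP_STORED) as bundle:
        bundle.write("hand_detector.tflite", arcname="hand_detector.tflite")
        bundle.write("hand_landmarks_detector.tflite", arcname="hand_landmarks_detector.tflite")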

But I get this error:

INVALID_ARGUMENT: Invalid metadata schema version: expected M001, got {

The MediaPipeTasksVision version I am using: pod 'MediaPipeTasksVision', '0.10.14'

How can I fix this problem?

Thanks,

James

I tried to inspect the original hand_landmark.task file in the iPhone app; the task contains 2 TFLite models, hand_detector and hand_landmarks_detector. There is no metadata and no handedness.txt.
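
For reference, this is roughly how I inspect the bundle, assuming it can be opened as a plain ZIP archive and that tflite_support is installed:

    import zipfile

    from tflite_support import metadata as _metadata

    # List the bundled files and dump whatever metadata each TFLite model carries.
    with zipfile.ZipFile("hand_landmarker.task") as bundle:
        print(bundle.namelist())
        for name in bundle.namelist():
            try:
                displayer = _metadata.MetadataDisplayer.with_model_buffer(bundle.read(name))
                print(name, displayer.get_metadata_json())
            except ValueError as err:
                # Raised when a model carries no metadata block.
                print(name, "no metadata:", err)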

Upvotes: 0

Views: 7

Answers (0)
