Reputation: 1526
I am using this code to extract text from an image. The first time, the code runs perfectly; after that, it starts giving this error message:

[coreml] Failed to get the home directory when checking model path

Here is the code that I am using to extract text from the image. It is the same code that I copied from Apple's documentation:
func requestORC(image: UIImage) {
    // guard let cgImage = UIImage(named: "test")?.cgImage else { return }
    guard let cgImage = image.cgImage else { return }

    // Create a new image-request handler.
    let requestHandler = VNImageRequestHandler(cgImage: cgImage)

    // Create a new request to recognize text.
    let request = VNRecognizeTextRequest(completionHandler: recognizeTextHandler)

    do {
        // Perform the text-recognition request.
        try requestHandler.perform([request])
    } catch {
        print("Unable to perform the requests: \(error).")
    }
}
func recognizeTextHandler(request: VNRequest, error: Error?) {
    guard let observations = request.results as? [VNRecognizedTextObservation] else {
        return
    }

    let recognizedStrings = observations.compactMap { observation in
        // Return the string of the top VNRecognizedText instance.
        return observation.topCandidates(1).first?.string
    }

    // Process the recognized strings.
    // print(recognizedStrings)
    self.recognizedStrings = recognizedStrings
}
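For completeness, this is roughly how the function is invoked; the view controller context and the asset name below are illustrative, not my exact code:

import UIKit
import Vision

class ScannerViewController: UIViewController {
    var recognizedStrings = [String]()

    override func viewDidLoad() {
        super.viewDidLoad()
        // Hypothetical call site; "sample" is a placeholder asset name.
        if let image = UIImage(named: "sample") {
            requestORC(image: image)
        }
    }

    // requestORC(image:) and recognizeTextHandler(request:error:) as shown above.
}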
Upvotes: 7
Views: 1108
Reputation: 58553
It may sound very strange, but the Failed to get the home directory when checking model path error occurs in the Xcode simulator only when you run a text recognition app with the default .accurate case of the request's recognition level. Change the value to .fast and the error will disappear.

It should also be said that when running the code on an actual device, the above error does not appear at all.
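If you only need the essential change, it is a single property on the request. Here is a minimal sketch applied to the requestORC(image:) function from the question:

let request = VNRecognizeTextRequest(completionHandler: recognizeTextHandler)
#if targetEnvironment(simulator)
// .fast avoids the home-directory error in the simulator;
// device builds keep the default .accurate level.
request.recognitionLevel = .fast
#endif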
Here's the code:
import SwiftUI
import Vision
import CoreML

struct ContentView: View {
    @State var recognizedStrings = [String]()
    @State var interpolatedString = ""

    var body: some View {
        ZStack {
            Color.yellow.ignoresSafeArea()
            Text(interpolatedString)
                .multilineTextAlignment(.center)
                .font(.largeTitle)
                .padding(.horizontal, 100)
        }
        .onAppear {
            self.opticalCharacterRecognition(.init(named: "make.png")!)
            if recognizedStrings.count > 0 {
                for i in 0 ..< recognizedStrings.count {
                    interpolatedString += recognizedStrings[i] + " "
                }
            }
        }
    }
}

extension ContentView {
    func opticalCharacterRecognition(_ image: UIImage) {
        guard let pngData = image.pngData() else { return }
        let requestHandler = VNImageRequestHandler(data: pngData)
        let request = VNRecognizeTextRequest(completionHandler: textHandler)

        #if targetEnvironment(simulator)
        print("No more errors...")
        request.recognitionLevel = .fast    // Here it is
        #endif

        do {
            try requestHandler.perform([request])
        } catch {
            print("Unable to perform the requests: \(error).")
        }
    }

    func textHandler(request: VNRequest, error: Error?) {
        guard let observations = request.results as? [VNRecognizedTextObservation]
        else { return }

        let recognizedStrings = observations.compactMap { observation in
            return observation.topCandidates(1).first?.string
        }
        self.recognizedStrings = recognizedStrings
    }
}
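A note on the design choice: because the recognitionLevel change is wrapped in #if targetEnvironment(simulator), the workaround is compiled only into simulator builds. Builds for real devices keep the default .accurate level, so no recognition quality is sacrificed where the error never appears anyway.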
Upvotes: 0