Tomer Schlesinger committed
Commit 5717d4a · unverified · 1 parent: a28f35e

docs : update README.md for whisper.objc app (#2569)

Files changed (1):
  1. examples/whisper.objc/README.md +14 -7
examples/whisper.objc/README.md CHANGED

````diff
@@ -26,10 +26,17 @@ If you don't want to convert a Core ML model, you can skip this step by creating
 mkdir models/ggml-base.en-encoder.mlmodelc
 ```
 
-## Core ML
-
-Follow the [`Core ML support` section of readme](../../README.md#core-ml-support) to convert the model.
-That is all the needs to be done to use the Core ML model in the app. The converted model is a
-resource in the project and will be used if it is available. Note that the Core ML model is only
-used for the encoder, the decoder which is in the ggml model is still required so both need to
-be available.
+### Core ML support
+1. Follow all the steps in the `Usage` section, including adding the ggml model file.
+   The ggml model file is required as the Core ML model is only used for the encoder. The
+   decoder which is in the ggml model is still required.
+2. Follow the [`Core ML support` section of readme](../../README.md#core-ml-support) to convert the
+   model.
+3. Add the Core ML model (`models/ggml-base.en-encoder.mlmodelc/`) to `whisper.swiftui.demo/Resources/models` **via Xcode**.
+
+When the example starts running you should now see that it is using the Core ML model:
+```console
+whisper_init_state: loading Core ML model from '/Library/Developer/CoreSimulator/Devices/25E8C27D-0253-4281-AF17-C3F2A4D1D8F4/data/Containers/Bundle/Application/3ADA7D59-7B9C-43B4-A7E1-A87183FC546A/whisper.swiftui.app/models/ggml-base.en-encoder.mlmodelc'
+whisper_init_state: first run on a device may take a while ...
+whisper_init_state: Core ML model loaded
+```
````