We can try it out with the srcnn model from the sr filter.
1) Get the srcnn.pb model file; see the sr filter documentation for details.
2) Convert srcnn.pb into an OpenVINO model with the command:
python mo_tf.py --input_model srcnn.pb --data_type=FP32 --input_shape [1,960,1440,1] --keep_shape_ops
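The --input_shape is NHWC, so [1,960,1440,1] means one batch of a 960-row by
1440-column single-channel (luma) image: exactly a 720x480 input after the 2x
scale in step 3. The IR shape is fixed, so the frame reaching dnn_processing
has to match it; a tiny sketch of that relation (mine, not part of the patch):

#include <stdio.h>

/* IR input from --input_shape [1,960,1440,1] (NHWC): H=960, W=1440 */
static int frame_matches_ir(int frame_w, int frame_h)
{
    const int ir_w = 1440, ir_h = 960;
    if (frame_w != ir_w || frame_h != ir_h) {
        fprintf(stderr, "frame %dx%d does not match IR input %dx%d\n",
                frame_w, frame_h, ir_w, ir_h);
        return -1;
    }
    return 0;
}

int main(void)
{
    /* 720x480 source scaled by 2, as in the ffmpeg command of step 3 */
    return frame_matches_ir(720 * 2, 480 * 2) < 0;
}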

The mo_tf.py script can be found at
https://github.com/openvinotoolkit/openvino/tree/master/model-optimizer
The conversion produces srcnn.xml and srcnn.bin in the current directory;
copy them to the directory from which ffmpeg is run.

I have also uploaded the model files at
https://github.com/guoyejun/dnn_processing/tree/master/models

3) Run with the openvino backend:
ffmpeg -i input.jpg -vf format=yuv420p,scale=w=iw*2:h=ih*2,dnn_processing=dnn_backend=openvino:model=srcnn.xml:input=x:output=srcnn/Maximum -y srcnn.ov.jpg
(The input.jpg resolution is 720x480.)
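For an application that embeds libavfilter rather than invoking the ffmpeg
tool, here is a minimal sketch of building the same chain programmatically
(my own illustration, not part of this patch; it assumes an FFmpeg build with
--enable-libopenvino, srcnn.xml/srcnn.bin in the working directory, and
yuv420p 720x480 input frames):

#include <libavfilter/avfilter.h>
#include <libavfilter/buffersrc.h>
#include <libavfilter/buffersink.h>

static int build_srcnn_graph(AVFilterGraph **pgraph,
                             AVFilterContext **psrc, AVFilterContext **psink)
{
    /* same chain as the command line above; the format=yuv420p step is
     * implied here by the buffer source's pix_fmt declaration */
    const char *chain = "scale=w=iw*2:h=ih*2,"
                        "dnn_processing=dnn_backend=openvino:model=srcnn.xml:"
                        "input=x:output=srcnn/Maximum";
    AVFilterGraph   *graph = avfilter_graph_alloc();
    AVFilterInOut   *in    = avfilter_inout_alloc();
    AVFilterInOut   *out   = avfilter_inout_alloc();
    AVFilterContext *src   = NULL, *sink = NULL;
    int ret;

    if (!graph || !in || !out)
        return AVERROR(ENOMEM);

    if ((ret = avfilter_graph_create_filter(&src,
                   avfilter_get_by_name("buffer"), "in",
                   "video_size=720x480:pix_fmt=yuv420p:"
                   "time_base=1/25:pixel_aspect=1/1", NULL, graph)) < 0 ||
        (ret = avfilter_graph_create_filter(&sink,
                   avfilter_get_by_name("buffersink"), "out",
                   NULL, NULL, graph)) < 0)
        return ret;

    /* connect buffer -> chain -> buffersink via the default in/out labels */
    out->name = av_strdup("in");  out->filter_ctx = src;  out->pad_idx = 0; out->next = NULL;
    in->name  = av_strdup("out"); in->filter_ctx  = sink; in->pad_idx = 0; in->next  = NULL;

    if ((ret = avfilter_graph_parse_ptr(graph, chain, &in, &out, NULL)) < 0 ||
        (ret = avfilter_graph_config(graph, NULL)) < 0)
        return ret;

    *pgraph = graph; *psrc = src; *psink = sink;
    return 0;
}

Frames would then be pushed with av_buffersrc_add_frame_flags() and pulled
with av_buffersink_get_frame(); cleanup (avfilter_inout_free(),
avfilter_graph_free()) is omitted for brevity.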

Signed-off-by: Guo, Yejun <yejun....@intel.com>
---
 doc/filters.texi                | 10 +++++++++-
 libavfilter/vf_dnn_processing.c |  5 ++++-
 2 files changed, 13 insertions(+), 2 deletions(-)

diff --git a/doc/filters.texi b/doc/filters.texi
index 84567de..d197d33 100644
--- a/doc/filters.texi
+++ b/doc/filters.texi
@@ -9288,13 +9288,21 @@ TensorFlow backend. To enable this backend you
 need to install the TensorFlow for C library (see
 @url{https://www.tensorflow.org/install/install_c}) and configure FFmpeg with
 @code{--enable-libtensorflow}
+
+@item openvino
+OpenVINO backend. To enable this backend you
+need to build and install the OpenVINO for C library (see
+@url{https://github.com/openvinotoolkit/openvino/blob/master/build-instruction.md}) and configure FFmpeg with
+@code{--enable-libopenvino} (--extra-cflags=-I... --extra-ldflags=-L... might
+be needed if the header files and libraries are not installed into system path)
+
 @end table
 
 Default value is @samp{native}.
 
 @item model
 Set path to model file specifying network architecture and its parameters.
-Note that different backends use different file formats. TensorFlow and native
+Note that different backends use different file formats. TensorFlow, OpenVINO and native
 backend can load files for only its format.
 
 Native model file (.model) can be generated from TensorFlow model file (.pb) by using tools/python/convert.py
diff --git a/libavfilter/vf_dnn_processing.c b/libavfilter/vf_dnn_processing.c
index cf589ac..4b31808 100644
--- a/libavfilter/vf_dnn_processing.c
+++ b/libavfilter/vf_dnn_processing.c
@@ -58,11 +58,14 @@ typedef struct DnnProcessingContext {
 #define OFFSET(x) offsetof(DnnProcessingContext, x)
 #define FLAGS AV_OPT_FLAG_FILTERING_PARAM | AV_OPT_FLAG_VIDEO_PARAM
 static const AVOption dnn_processing_options[] = {
-    { "dnn_backend", "DNN backend",                OFFSET(backend_type),     
AV_OPT_TYPE_INT,       { .i64 = 0 },    0, 1, FLAGS, "backend" },
+    { "dnn_backend", "DNN backend",                OFFSET(backend_type),     
AV_OPT_TYPE_INT,       { .i64 = 0 },    INT_MIN, INT_MAX, FLAGS, "backend" },
     { "native",      "native backend flag",        0,                        
AV_OPT_TYPE_CONST,     { .i64 = 0 },    0, 0, FLAGS, "backend" },
 #if (CONFIG_LIBTENSORFLOW == 1)
     { "tensorflow",  "tensorflow backend flag",    0,                        
AV_OPT_TYPE_CONST,     { .i64 = 1 },    0, 0, FLAGS, "backend" },
 #endif
+#if (CONFIG_LIBOPENVINO == 1)
+    { "openvino",    "openvino backend flag",      0,                        
AV_OPT_TYPE_CONST,     { .i64 = 2 },    0, 0, FLAGS, "backend" },
+#endif
     { "model",       "path to model file",         OFFSET(model_filename),   
AV_OPT_TYPE_STRING,    { .str = NULL }, 0, 0, FLAGS },
     { "input",       "input name of the model",    OFFSET(model_inputname),  
AV_OPT_TYPE_STRING,    { .str = NULL }, 0, 0, FLAGS },
     { "output",      "output name of the model",   OFFSET(model_outputname), 
AV_OPT_TYPE_STRING,    { .str = NULL }, 0, 0, FLAGS },
-- 
2.7.4
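With the AVOption change above, the backend can also be selected by name from
application code before the filter is initialized. A hedged sketch (mine;
select_openvino and fctx are illustrative names, fctx being a freshly
allocated, not yet initialized dnn_processing filter context):

#include <libavfilter/avfilter.h>
#include <libavutil/opt.h>

/* AV_OPT_SEARCH_CHILDREN lets av_opt_set() reach the filter's private
 * options, where "dnn_backend" and its named constants live. */
static int select_openvino(AVFilterContext *fctx)
{
    int ret = av_opt_set(fctx, "dnn_backend", "openvino", AV_OPT_SEARCH_CHILDREN);
    if (ret < 0) /* e.g. the build was configured without --enable-libopenvino */
        return ret;
    return avfilter_init_str(fctx, NULL);
}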
