Remove all OpenVINO-related recipes, tests, documentation and other
data from meta-intel, as a new layer specific to OpenVINO has been
created.

meta-openvino layer URL:
https://github.com/intel/meta-openvino
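
Users who want to keep OpenVINO support in their builds can pull in the
new layer instead. A minimal sketch (assuming a poky-based build set up
as in the removed documentation; the relative path is illustrative):

```shell
# Fetch the new OpenVINO layer next to the existing layers
git clone https://github.com/intel/meta-openvino

# Register it with the current build (run from the build directory)
bitbake-layers add-layer ../meta-openvino
```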

Signed-off-by: Yogesh Tyagi <[email protected]>
---
 conf/include/maintainers.inc                  |   3 -
 documentation/openvino.md                     |  95 ------------
 .../dldt/openvino-model-optimizer_2024.1.0.bb |  33 ----
 ...specific-tweaks-to-the-build-process.patch |  86 -----------
 ...2-cmake-Fix-overloaded-virtual-error.patch |  33 ----
 ...obuf-allow-target-protoc-to-be-built.patch |  45 ------
 .../open-model-zoo/0001-use-oe-gflags.patch   |  27 ----
 .../opencv/open-model-zoo_2024.1.0.bb         |  54 -------
 .../openvino-inference-engine_2024.1.0.bb     | 146 ------------------
 .../runtime/cases/dldt_inference_engine.py    | 109 -------------
 .../runtime/cases/dldt_model_optimizer.py     |  38 -----
 .../classification_sample.py                  | 135 ----------------
 lib/oeqa/runtime/miutils/dldtutils.py         |   3 -
 .../tests/dldt_inference_engine_test.py       |  56 -------
 .../tests/dldt_model_optimizer_test.py        |  23 ---
 .../tests/squeezenet_model_download_test.py   |  25 ---
 16 files changed, 911 deletions(-)
 delete mode 100644 documentation/openvino.md
 delete mode 100644 dynamic-layers/meta-python/recipes-opencv/dldt/openvino-model-optimizer_2024.1.0.bb
 delete mode 100644 dynamic-layers/openembedded-layer/recipes-support/opencv/files/0001-cmake-yocto-specific-tweaks-to-the-build-process.patch
 delete mode 100644 dynamic-layers/openembedded-layer/recipes-support/opencv/files/0002-cmake-Fix-overloaded-virtual-error.patch
 delete mode 100644 dynamic-layers/openembedded-layer/recipes-support/opencv/files/0003-protobuf-allow-target-protoc-to-be-built.patch
 delete mode 100644 dynamic-layers/openembedded-layer/recipes-support/opencv/open-model-zoo/0001-use-oe-gflags.patch
 delete mode 100644 dynamic-layers/openembedded-layer/recipes-support/opencv/open-model-zoo_2024.1.0.bb
 delete mode 100644 dynamic-layers/openembedded-layer/recipes-support/opencv/openvino-inference-engine_2024.1.0.bb
 delete mode 100644 lib/oeqa/runtime/cases/dldt_inference_engine.py
 delete mode 100644 lib/oeqa/runtime/cases/dldt_model_optimizer.py
 delete mode 100644 lib/oeqa/runtime/files/dldt-inference-engine/classification_sample.py
 delete mode 100644 lib/oeqa/runtime/miutils/dldtutils.py
 delete mode 100644 lib/oeqa/runtime/miutils/tests/dldt_inference_engine_test.py
 delete mode 100644 lib/oeqa/runtime/miutils/tests/dldt_model_optimizer_test.py
 delete mode 100644 lib/oeqa/runtime/miutils/tests/squeezenet_model_download_test.py

diff --git a/conf/include/maintainers.inc b/conf/include/maintainers.inc
index 9849d0d8..da90c813 100644
--- a/conf/include/maintainers.inc
+++ b/conf/include/maintainers.inc
@@ -39,10 +39,7 @@ RECIPE_MAINTAINER:pn-onednn = "Naveen Saini <[email protected]>"
 RECIPE_MAINTAINER:pn-onedpl = "Naveen Saini <[email protected]>"
 RECIPE_MAINTAINER:pn-onevpl = "Naveen Saini <[email protected]>"
 RECIPE_MAINTAINER:pn-onevpl-intel-gpu = "Yew Chang Ching <[email protected]>"
-RECIPE_MAINTAINER:pn-open-model-zoo = "Anuj Mittal <[email protected]>"
 RECIPE_MAINTAINER:pn-opencl-clang = "Naveen Saini <[email protected]>"
-RECIPE_MAINTAINER:pn-openvino-inference-engine = "Anuj Mittal <[email protected]>"
-RECIPE_MAINTAINER:pn-openvino-model-optimizer = "Anuj Mittal <[email protected]>"
 RECIPE_MAINTAINER:pn-openvkl = "Naveen Saini <[email protected]>"
 RECIPE_MAINTAINER:pn-ospray = "Naveen Saini <[email protected]>"
 RECIPE_MAINTAINER:pn-ovmf-shell-image-enrollkeys = "Naveen Saini <[email protected]>"
diff --git a/documentation/openvino.md b/documentation/openvino.md
deleted file mode 100644
index 50dc680d..00000000
--- a/documentation/openvino.md
+++ /dev/null
@@ -1,95 +0,0 @@
-Build a Yocto Image with OpenVINO™ toolkit
-==========================================
-
-Follow the [Yocto Project official documentation](https://docs.yoctoproject.org/brief-yoctoprojectqs/index.html#compatible-linux-distribution) to set up and configure your host machine to be compatible with BitBake.
-
-## Step 1: Set Up Environment
-
-1. Clone the repositories.
-
-```
-      git clone https://git.yoctoproject.org/git/poky
-      git clone https://github.com/openembedded/meta-openembedded
-      git clone https://git.yoctoproject.org/git/meta-intel
-```
-
-
-2. Set up the OpenEmbedded build environment.
-
-```
-      source poky/oe-init-build-env
-
-```
-
-
-
-3. Add BitBake layers.
-
-
-```
-      bitbake-layers add-layer ../meta-openembedded/meta-oe
-      bitbake-layers add-layer ../meta-openembedded/meta-python
-      bitbake-layers add-layer ../meta-intel
-
-```
-
-
-4. Set up BitBake configurations.
-   Include extra configuration in the `conf/local.conf` file in your build directory as required.
-
-
-```
-      MACHINE = "intel-skylake-64"
-
-      # Enable building OpenVINO Python API.
-      # This requires meta-python layer to be included in bblayers.conf.
-      PACKAGECONFIG:append:pn-openvino-inference-engine = " python3"
-
-      # This adds OpenVINO related libraries in the target image.
-      CORE_IMAGE_EXTRA_INSTALL:append = " openvino-inference-engine"
-
-      # This adds OpenVINO samples in the target image.
-      CORE_IMAGE_EXTRA_INSTALL:append = " openvino-inference-engine-samples"
-
-      # Include OpenVINO Python API package in the target image.
-      CORE_IMAGE_EXTRA_INSTALL:append = " openvino-inference-engine-python3"
-
-      # Include model conversion API in the target image.
-      CORE_IMAGE_EXTRA_INSTALL:append = " openvino-model-optimizer"
-
-```
-
-## Step 2: Build a Yocto Image with OpenVINO Packages
-
-Run BitBake to build your image with OpenVINO packages. For example, to build the minimal image, run the following command:
-
-
-```
-   bitbake core-image-minimal
-
-```
-
-## Step 3: Verify the Yocto Image
-
-Verify that OpenVINO packages were built successfully. Run the following command:
-
-```
-   oe-pkgdata-util list-pkgs | grep openvino
-
-```
-
-
-If the image build is successful, it will return the list of packages as below:
-
-```
-   openvino-inference-engine
-   openvino-inference-engine-dbg
-   openvino-inference-engine-dev
-   openvino-inference-engine-python3
-   openvino-inference-engine-samples
-   openvino-inference-engine-src
-   openvino-model-optimizer
-   openvino-model-optimizer-dbg
-   openvino-model-optimizer-dev
-
-```
diff --git a/dynamic-layers/meta-python/recipes-opencv/dldt/openvino-model-optimizer_2024.1.0.bb b/dynamic-layers/meta-python/recipes-opencv/dldt/openvino-model-optimizer_2024.1.0.bb
deleted file mode 100644
index de765d6c..00000000
--- a/dynamic-layers/meta-python/recipes-opencv/dldt/openvino-model-optimizer_2024.1.0.bb
+++ /dev/null
@@ -1,33 +0,0 @@
-SUMMARY = "OpenVINO Model Optimzer"
-DESCRIPTION = "Model Optimizer is a cross-platform command-line tool that \
-facilitates the transition between the training and deployment \
-environment, performs static model analysis, and adjusts deep \
-learning models for optimal execution on end-point target devices."
-HOMEPAGE = "https://01.org/openvinotoolkit"
-
-SRC_URI = "git://github.com/openvinotoolkit/openvino.git;protocol=https;branch=releases/2024/1;lfs=0 \
-           "
-SRCREV = "f4afc983258bcb2592d999ed6700043fdb58ad78"
-
-LICENSE = "Apache-2.0"
-LIC_FILES_CHKSUM = "file://LICENSE;md5=86d3f3a95c324c9479bd8986968f4327"
-
-CVE_PRODUCT = "intel:openvino"
-S = "${WORKDIR}/git"
-
-inherit setuptools3
-
-SETUPTOOLS_SETUP_PATH = "${WORKDIR}/git/tools/mo"
-
-RDEPENDS:${PN} += " \
-                    python3-defusedxml \
-                    python3-fastjsonschema \
-                    python3-networkx \
-                    python3-numpy \
-                    python3-protobuf \
-                    python3-requests \
-                    python3-urllib3 \
-                    bash \
-                    "
-
-UPSTREAM_CHECK_GITTAGREGEX = "(?P<pver>(\d+\.\d+\.\d+))$"
diff --git a/dynamic-layers/openembedded-layer/recipes-support/opencv/files/0001-cmake-yocto-specific-tweaks-to-the-build-process.patch b/dynamic-layers/openembedded-layer/recipes-support/opencv/files/0001-cmake-yocto-specific-tweaks-to-the-build-process.patch
deleted file mode 100644
index 7f5b46c6..00000000
--- a/dynamic-layers/openembedded-layer/recipes-support/opencv/files/0001-cmake-yocto-specific-tweaks-to-the-build-process.patch
+++ /dev/null
@@ -1,86 +0,0 @@
-From e4edbdae9a2dbfec6fd0706bdfff8abdfe3363fc Mon Sep 17 00:00:00 2001
-From: Anuj Mittal <[email protected]>
-Date: Wed, 29 Nov 2023 12:42:57 +0530
-Subject: [PATCH] cmake: yocto specific tweaks to the build process
-
-* Dont try to detect glibc version as that doesn't work when cross compiling.
-* Dont try to detect CXX11_ABI
-* Install sample binaries as well.
-* Dont try to write triggers for CPack. We package ourselves.
-* Fix the installation path for Python modules when baselib = lib64.
-
-Upstream-Status: Inappropriate
-
-Signed-off-by: Anuj Mittal <[email protected]>
----
- cmake/developer_package/packaging/rpm/rpm.cmake | 2 +-
- cmake/developer_package/target_flags.cmake      | 4 ++--
- samples/cpp/CMakeLists.txt                      | 6 +++---
- src/bindings/python/CMakeLists.txt              | 2 +-
- 4 files changed, 7 insertions(+), 7 deletions(-)
-
-diff --git a/cmake/developer_package/packaging/rpm/rpm.cmake b/cmake/developer_package/packaging/rpm/rpm.cmake
-index 99f11730983..1a1f61fcd3d 100644
---- a/cmake/developer_package/packaging/rpm/rpm.cmake
-+++ b/cmake/developer_package/packaging/rpm/rpm.cmake
-@@ -156,7 +156,7 @@ ov_rpm_specific_settings()
- # needed to add triggers for packages with libraries
- set(def_triggers "${OpenVINO_BINARY_DIR}/_CPack_Packages/triggers")
- set(triggers_content "# /bin/sh -p\n/sbin/ldconfig\n")
--file(WRITE "${def_triggers}" "${triggers_content}")
-+#file(WRITE "${def_triggers}" "${triggers_content}")
- 
- #
- # Functions helpful for packaging your modules with RPM cpack
-diff --git a/cmake/developer_package/target_flags.cmake b/cmake/developer_package/target_flags.cmake
-index d047a1aebd9..4e8ca68c60f 100644
---- a/cmake/developer_package/target_flags.cmake
-+++ b/cmake/developer_package/target_flags.cmake
-@@ -149,7 +149,7 @@ function(ov_glibc_version)
-     endif()
- endfunction()
- 
--ov_glibc_version()
-+#ov_glibc_version()
- 
- #
- # Detects default value for _GLIBCXX_USE_CXX11_ABI for current compiler
-@@ -160,4 +160,4 @@ macro(ov_get_glibcxx_use_cxx11_abi)
-     endif()
- endmacro()
- 
--ov_get_glibcxx_use_cxx11_abi()
-+#ov_get_glibcxx_use_cxx11_abi()
-diff --git a/samples/cpp/CMakeLists.txt b/samples/cpp/CMakeLists.txt
-index 4d33bff944e..3e7f1458578 100644
---- a/samples/cpp/CMakeLists.txt
-+++ b/samples/cpp/CMakeLists.txt
-@@ -206,9 +206,9 @@ macro(ov_add_sample)
-    target_link_libraries(${SAMPLE_NAME} PRIVATE ${ov_link_libraries} Threads::Threads ${SAMPLE_DEPENDENCIES})
- 
-     install(TARGETS ${SAMPLE_NAME}
--            RUNTIME DESTINATION samples_bin/
--            COMPONENT samples_bin
--            EXCLUDE_FROM_ALL)
-+            DESTINATION ${CMAKE_INSTALL_BINDIR}
-+            COMPONENT samples_bin)
-+
- 
-     # create global target with all samples / demo apps
-     if(NOT TARGET ov_samples)
-diff --git a/src/bindings/python/CMakeLists.txt b/src/bindings/python/CMakeLists.txt
-index 6cf43ec3fed..d539b9d003f 100644
---- a/src/bindings/python/CMakeLists.txt
-+++ b/src/bindings/python/CMakeLists.txt
-@@ -320,7 +320,7 @@ if(ENABLE_PYTHON_PACKAGING)
-     # install OpenVINO Python API
- 
-    set(python_package_prefix "${CMAKE_CURRENT_BINARY_DIR}/install_${pyversion}")
--    set(install_lib "${python_package_prefix}/lib/${python_versioned_folder}/${ov_site_packages}")
-+    set(install_lib "${python_package_prefix}/${CMAKE_INSTALL_LIBDIR}/${python_versioned_folder}/${ov_site_packages}")
-    set(openvino_meta_info_subdir "openvino-${OpenVINO_VERSION}-py${python_xy}.egg-info")
-    set(openvino_meta_info_file "${install_lib}/${openvino_meta_info_subdir}/PKG-INFO")
- 
--- 
-2.34.1
-
diff --git a/dynamic-layers/openembedded-layer/recipes-support/opencv/files/0002-cmake-Fix-overloaded-virtual-error.patch b/dynamic-layers/openembedded-layer/recipes-support/opencv/files/0002-cmake-Fix-overloaded-virtual-error.patch
deleted file mode 100644
index 8a1464d5..00000000
--- a/dynamic-layers/openembedded-layer/recipes-support/opencv/files/0002-cmake-Fix-overloaded-virtual-error.patch
+++ /dev/null
@@ -1,33 +0,0 @@
-From 4a909a03b6dd336e7ea76e3f44d7cfb5d7e44798 Mon Sep 17 00:00:00 2001
-From: Anuj Mittal <[email protected]>
-Date: Wed, 29 Nov 2023 12:49:35 +0530
-Subject: [PATCH 2/3] cmake: Fix overloaded-virtual error
-
-* Remove -Werror for:
-|git/src/plugins/intel_gpu/src/kernel_selector/jitter.h:129:28: error: 'virtual kernel_selector::JitDefinitions kernel_selector::JitConstant::GetDefinitions() const' was hidden [-Werror=overloaded-virtual=]
-|  129 |     virtual JitDefinitions GetDefinitions() const = 0;
-|      |
-
-Upstream-Status: Pending
-
-Signed-off-by: Anuj Mittal <[email protected]>
----
- src/plugins/intel_gpu/CMakeLists.txt | 2 +-
- 1 file changed, 1 insertion(+), 1 deletion(-)
-
-diff --git a/src/plugins/intel_gpu/CMakeLists.txt b/src/plugins/intel_gpu/CMakeLists.txt
-index 2f3d9127dde..2fd4f5c1b3c 100644
---- a/src/plugins/intel_gpu/CMakeLists.txt
-+++ b/src/plugins/intel_gpu/CMakeLists.txt
-@@ -47,7 +47,7 @@ add_subdirectory(thirdparty)
- include(thirdparty/cmake/rapidjson.cmake)
- 
- if(CMAKE_COMPILER_IS_GNUCXX)
--    ov_add_compiler_flags(-Werror)
-+      #ov_add_compiler_flags(-Werror)
- endif()
- 
- add_subdirectory(src/runtime)
--- 
-2.34.1
-
diff --git a/dynamic-layers/openembedded-layer/recipes-support/opencv/files/0003-protobuf-allow-target-protoc-to-be-built.patch b/dynamic-layers/openembedded-layer/recipes-support/opencv/files/0003-protobuf-allow-target-protoc-to-be-built.patch
deleted file mode 100644
index bbdeaa2a..00000000
--- a/dynamic-layers/openembedded-layer/recipes-support/opencv/files/0003-protobuf-allow-target-protoc-to-be-built.patch
+++ /dev/null
@@ -1,45 +0,0 @@
-From 450d94b475460d1af32b207d0ced495794863f0d Mon Sep 17 00:00:00 2001
-From: Anuj Mittal <[email protected]>
-Date: Wed, 29 Nov 2023 12:55:19 +0530
-Subject: [PATCH 3/3] protobuf: allow target protoc to be built
-
-We can run target binaries using a qemu wrapper so allow these to be
-built and run.
-
-Upstream-Status: Inappropriate
-
-Signed-off-by: Anuj Mittal <[email protected]>
----
- cmake/developer_package/frontends/frontends.cmake | 2 +-
- thirdparty/protobuf/CMakeLists.txt                | 2 +-
- 2 files changed, 2 insertions(+), 2 deletions(-)
-
-diff --git a/cmake/developer_package/frontends/frontends.cmake b/cmake/developer_package/frontends/frontends.cmake
-index f3b5520d6d2..7579f638c5a 100644
---- a/cmake/developer_package/frontends/frontends.cmake
-+++ b/cmake/developer_package/frontends/frontends.cmake
-@@ -163,7 +163,7 @@ macro(ov_add_frontend)
-         set(OUTPUT_PB_HEADER ${CMAKE_CURRENT_BINARY_DIR}/${relative_path}/${FILE_WE}.pb.h)
-         add_custom_command(
-                 OUTPUT "${OUTPUT_PB_SRC}" "${OUTPUT_PB_HEADER}"
--                COMMAND ${PROTOC_EXECUTABLE} ARGS --cpp_out ${CMAKE_CURRENT_BINARY_DIR} -I ${protofiles_root_dir} ${proto_file}
-+                COMMAND protoc ARGS --cpp_out ${CMAKE_CURRENT_BINARY_DIR} -I ${protofiles_root_dir} ${proto_file}
-                 DEPENDS ${PROTOC_DEPENDENCY} ${proto_file}
-                 COMMENT "Running C++ protocol buffer compiler (${PROTOC_EXECUTABLE}) on ${proto_file_relative}"
-                 VERBATIM
-diff --git a/thirdparty/protobuf/CMakeLists.txt b/thirdparty/protobuf/CMakeLists.txt
-index 15f32601f23..36853caf7dc 100644
---- a/thirdparty/protobuf/CMakeLists.txt
-+++ b/thirdparty/protobuf/CMakeLists.txt
-@@ -31,7 +31,7 @@ unset(HAVE_ZLIB CACHE)
- if(CMAKE_CROSSCOMPILING OR
-     (APPLE AND (HOST_X86_64 AND AARCH64)) OR
-     (MSVC AND (HOST_X86_64 AND (AARCH64 OR ARM))))
--    set(protobuf_BUILD_PROTOC_BINARIES OFF CACHE BOOL "Build protoc binaries" FORCE)
-+    set(protobuf_BUILD_PROTOC_BINARIES ON CACHE BOOL "Build protoc binaries" FORCE)
- else()
-    set(protobuf_BUILD_PROTOC_BINARIES ON CACHE BOOL "Build protoc binaries" FORCE)
- endif()
--- 
-2.34.1
-
diff --git a/dynamic-layers/openembedded-layer/recipes-support/opencv/open-model-zoo/0001-use-oe-gflags.patch b/dynamic-layers/openembedded-layer/recipes-support/opencv/open-model-zoo/0001-use-oe-gflags.patch
deleted file mode 100644
index 816a98a3..00000000
--- a/dynamic-layers/openembedded-layer/recipes-support/opencv/open-model-zoo/0001-use-oe-gflags.patch
+++ /dev/null
@@ -1,27 +0,0 @@
-From 804b08023b3f8e72b8e3eb09e464d6775c11d966 Mon Sep 17 00:00:00 2001
-From: Naveen Saini <[email protected]>
-Date: Fri, 21 Oct 2022 11:38:23 +0800
-Subject: [PATCH] demos: use gflags from meta-oe
-
-Upstream-Status: Inappropriate
-
-Signed-off-by: Anuj Mittal <[email protected]>
-Signed-off-by: Naveen Saini <[email protected]>
-
----
- demos/CMakeLists.txt | 2 +-
- 1 file changed, 1 insertion(+), 1 deletion(-)
-
-diff --git a/demos/CMakeLists.txt b/demos/CMakeLists.txt
-index 51767051c..fb7e3d22f 100644
---- a/demos/CMakeLists.txt
-+++ b/demos/CMakeLists.txt
-@@ -141,7 +141,7 @@ endmacro()
- find_package(OpenCV REQUIRED COMPONENTS core highgui videoio imgproc imgcodecs)
- find_package(OpenVINO REQUIRED COMPONENTS Runtime)
- 
--add_subdirectory(thirdparty/gflags)
-+#add_subdirectory(thirdparty/gflags)
- add_subdirectory(common/cpp)
- 
- find_package(OpenCV QUIET COMPONENTS gapi)
diff --git a/dynamic-layers/openembedded-layer/recipes-support/opencv/open-model-zoo_2024.1.0.bb b/dynamic-layers/openembedded-layer/recipes-support/opencv/open-model-zoo_2024.1.0.bb
deleted file mode 100644
index a9422e70..00000000
--- a/dynamic-layers/openembedded-layer/recipes-support/opencv/open-model-zoo_2024.1.0.bb
+++ /dev/null
@@ -1,54 +0,0 @@
-SUMMARY = "OpenVINO(TM) Toolkit - Open Model Zoo repository"
-HOMEPAGE = "https://github.com/opencv/open_model_zoo"
-DESCRIPTION = "This repository includes optimized deep learning \
-models and a set of demos to expedite development of high-performance \
-deep learning inference applications."
-
-SRC_URI = "git://github.com/opencv/open_model_zoo.git;protocol=https;branch=master \
-           file://0001-use-oe-gflags.patch \
-           "
-
-SRCREV = "cf5141dad2a4f24e1c5d5b9d43219ed804c48bbf"
-
-LICENSE = "Apache-2.0"
-LIC_FILES_CHKSUM = "file://LICENSE;md5=86d3f3a95c324c9479bd8986968f4327 \
-"
-
-inherit cmake
-
-S = "${WORKDIR}/git"
-OECMAKE_SOURCEPATH = "${S}/demos"
-
-DEPENDS += "openvino-inference-engine opencv gflags"
-
-RDEPENDS:${PN} += " \
-                   python3-decorator \
-                   python3-defusedxml \
-                   python3-networkx \
-                   python3-protobuf \
-                   python3-requests \
-                   python3-pyyaml \
-                   python3-numpy \
-                   bash \
-"
-
-COMPATIBLE_HOST = '(x86_64).*-linux'
-COMPATIBLE_HOST:libc-musl = "null"
-
-EXTRA_OECMAKE += " \
-                 -DENABLE_SAMPLES=ON \
-                 "
-
-do_install(){
-        install -d ${D}${libdir}
-        install -d ${D}${bindir}
-        install -d ${D}${datadir}/openvino/open-model-zoo/tools
-        install -d ${D}${datadir}/openvino/open-model-zoo/demos
-        cp -rf ${B}/intel64/Release/*.a ${D}${libdir}
-        cp -rf ${B}/intel64/Release/*_demo* ${D}${bindir}
-        cp -rf ${S}/models ${D}${datadir}/openvino/open-model-zoo
-        cp -rf ${S}/demos ${D}${datadir}/openvino/open-model-zoo
-        cp -rf ${S}/tools/model_tools ${D}${datadir}/openvino/open-model-zoo/tools
-}
-
-FILES:${PN} += "${datadir}/openvino"
diff --git a/dynamic-layers/openembedded-layer/recipes-support/opencv/openvino-inference-engine_2024.1.0.bb b/dynamic-layers/openembedded-layer/recipes-support/opencv/openvino-inference-engine_2024.1.0.bb
deleted file mode 100644
index 675d9920..00000000
--- a/dynamic-layers/openembedded-layer/recipes-support/opencv/openvino-inference-engine_2024.1.0.bb
+++ /dev/null
@@ -1,146 +0,0 @@
-SUMMARY = "OpenVINO(TM) Toolkit - Deep Learning Deployment Toolkit"
-HOMEPAGE = "https://github.com/opencv/dldt"
-DESCRIPTION = "This toolkit allows developers to deploy pre-trained \
-deep learning models through a high-level C++ Inference Engine API \
-integrated with application logic."
-
-SRC_URI = "git://github.com/openvinotoolkit/openvino.git;protocol=https;name=openvino;branch=releases/2024/1;lfs=0 \
-           git://github.com/openvinotoolkit/oneDNN.git;protocol=https;destsuffix=git/src/plugins/intel_cpu/thirdparty/onednn;name=mkl;nobranch=1 \
-           git://github.com/oneapi-src/oneDNN.git;protocol=https;destsuffix=git/src/plugins/intel_gpu/thirdparty/onednn_gpu;name=onednn;nobranch=1 \
-           git://github.com/herumi/xbyak.git;protocol=https;destsuffix=git/thirdparty/xbyak;name=xbyak;branch=master \
-           git://github.com/nlohmann/json.git;protocol=https;destsuffix=git/thirdparty/json/nlohmann_json;name=json;branch=develop \
-           git://github.com/opencv/ade.git;protocol=https;destsuffix=git/thirdparty/ade;name=ade;nobranch=1 \
-           git://github.com/protocolbuffers/protobuf.git;protocol=https;destsuffix=git/thirdparty/protobuf/protobuf;name=protobuf;branch=3.20.x \
-           git://github.com/gflags/gflags.git;protocol=https;destsuffix=git/thirdparty/gflags/gflags;name=gflags;nobranch=1 \
-           git://github.com/madler/zlib.git;protocol=https;destsuffix=git/thirdparty/zlib/zlib;name=zlib;nobranch=1 \
-           git://github.com/openvinotoolkit/mlas.git;protocol=https;destsuffix=git/src/plugins/intel_cpu/thirdparty/mlas;name=mlas;nobranch=1 \
-           git://github.com/nodejs/node-api-headers.git;protocol=https;destsuffix=git/node-api-headers-src;name=node-api-headers;nobranch=1 \
-           git://github.com/nodejs/node-addon-api.git;protocol=https;destsuffix=git/node-addon-api-src;name=node-addon-api;nobranch=1 \
-           git://github.com/openvinotoolkit/telemetry.git;protocol=https;destsuffix=git/thirdparty/telemetry;name=telemetry;nobranch=1;lfs=0 \
-           file://0001-cmake-yocto-specific-tweaks-to-the-build-process.patch \
-           file://0002-cmake-Fix-overloaded-virtual-error.patch \
-           file://0003-protobuf-allow-target-protoc-to-be-built.patch \
-           "
-
-SRCREV_openvino = "f4afc983258bcb2592d999ed6700043fdb58ad78"
-SRCREV_mkl = "26633ae49edd4353a29b7170d9fcef6b2d79f4b3"
-SRCREV_onednn = "4e6ff043c439652fcf6c400ac4e0c81bbac7c71c"
-SRCREV_xbyak = "740dff2e866f3ae1a70dd42d6e8836847ed95cc2"
-SRCREV_json = "9cca280a4d0ccf0c08f47a99aa71d1b0e52f8d03"
-SRCREV_ade = "0e8a2ccdd34f29dba55894f5f3c5179809888b9e"
-SRCREV_protobuf = "fe271ab76f2ad2b2b28c10443865d2af21e27e0e"
-SRCREV_gflags = "e171aa2d15ed9eb17054558e0b3a6a413bb01067"
-SRCREV_zlib = "09155eaa2f9270dc4ed1fa13e2b4b2613e6e4851"
-SRCREV_mlas = "d1bc25ec4660cddd87804fcf03b2411b5dfb2e94"
-SRCREV_node-api-headers = "186e04b5e40e54d7fd1655bc67081cc483f12488"
-SRCREV_node-addon-api = "39a25bf27788ff7a7ea5c64978c4dcd1e7b9d80d"
-SRCREV_telemetry = "58e16c257a512ec7f451c9fccf9ff455065b285b"
-SRCREV_FORMAT = "openvino_mkl_onednn_xbyak_json_ade_protobuf_gflags_zlib_node-api-headers_node-addon-api_mlas_telemetry"
-
-LICENSE = "Apache-2.0 & MIT & BSD-3-Clause & Zlib"
-LIC_FILES_CHKSUM = "file://LICENSE;md5=86d3f3a95c324c9479bd8986968f4327 \
-                    file://thirdparty/xbyak/COPYRIGHT;md5=3c98edfaa50a86eeaef4c6109e803f16 \
-                    file://thirdparty/cnpy/LICENSE;md5=689f10b06d1ca2d4b1057e67b16cd580 \
-                    file://thirdparty/json/nlohmann_json/LICENSE.MIT;md5=f969127d7b7ed0a8a63c2bbeae002588 \
-                    file://thirdparty/ade/LICENSE;md5=3b83ef96387f14655fc854ddc3c6bd57 \
-                    file://thirdparty/gflags/gflags/COPYING.txt;md5=c80d1a3b623f72bb85a4c75b556551df \
-                    file://thirdparty/zlib/zlib/LICENSE;md5=b51a40671bc46e961c0498897742c0b8 \
-                    file://src/plugins/intel_cpu/thirdparty/mlas/LICENSE;md5=86d3f3a95c324c9479bd8986968f4327 \
-                    file://src/plugins/intel_cpu/thirdparty/onednn/LICENSE;md5=3b64000f6e7d52516017622a37a94ce9 \
-                    file://src/plugins/intel_gpu/thirdparty/onednn_gpu/LICENSE;md5=3b64000f6e7d52516017622a37a94ce9 \
-                    file://node-api-headers-src/LICENSE;md5=6adb2909701d4605b4b2ae1a9b25d8bd \
-                    file://node-addon-api-src/LICENSE.md;md5=0492ef29a9d558a3e9660e7accc9ca6a \
-                    file://thirdparty/telemetry/LICENSE;md5=86d3f3a95c324c9479bd8986968f4327 \
-"
-
-inherit cmake python3native pkgconfig qemu
-
-S = "${WORKDIR}/git"
-EXTRA_OECMAKE += " \
-                  -DCMAKE_CROSSCOMPILING_EMULATOR=${WORKDIR}/qemuwrapper \
-                  -DENABLE_OPENCV=OFF \
-                  -DENABLE_INTEL_GNA=OFF \
-                  -DENABLE_SYSTEM_TBB=ON \
-                  -DPYTHON_EXECUTABLE=${PYTHON} \
-                  -DCMAKE_BUILD_TYPE=RelWithDebInfo \
-                  -DTHREADING=TBB -DTBB_DIR="${STAGING_LIBDIR}/cmake/TBB" \
-                  -DTREAT_WARNING_AS_ERROR=FALSE \
-                  -DENABLE_DATA=FALSE \
-                  -DENABLE_SYSTEM_PUGIXML=TRUE \
-                  -DENABLE_OV_ONNX_FRONTEND=FALSE \
-                  -DUSE_BUILD_TYPE_SUBFOLDER=OFF \
-                  -DENABLE_FUZZING=OFF \
-                  -DENABLE_TBBBIND_2_5=OFF \
-                  -DCPACK_GENERATOR=RPM \
-                  -DENABLE_SYSTEM_FLATBUFFERS=ON \
-                  -DENABLE_SYSTEM_SNAPPY=ON \
-                  -DFETCHCONTENT_BASE_DIR="${S}" \
-                  -DENABLE_INTEL_NPU=OFF \
-                  "
-
-DEPENDS += "\
-            flatbuffers-native \
-            pugixml \
-            python3-pybind11 \
-            python3-pybind11-native \
-            qemu-native \
-            snappy \
-            tbb \
-            "
-
-COMPATIBLE_HOST = '(x86_64).*-linux'
-COMPATIBLE_HOST:libc-musl = "null"
-
-PACKAGECONFIG ?= "opencl samples"
-PACKAGECONFIG[opencl] = "-DENABLE_INTEL_GPU=TRUE, -DENABLE_INTEL_GPU=FALSE, virtual/opencl-icd opencl-headers opencl-clhpp,"
-PACKAGECONFIG[python3] = "-DENABLE_PYTHON=ON -DPYTHON_LIBRARY=${PYTHON_LIBRARY} -DPYTHON_INCLUDE_DIR=${PYTHON_INCLUDE_DIR} -DENABLE_PYTHON_PACKAGING=ON, -DENABLE_PYTHON=OFF, patchelf-native, python3 python3-numpy python3-progress"
-PACKAGECONFIG[samples] = "-DENABLE_SAMPLES=ON -DENABLE_COMPILE_TOOL=ON, -DENABLE_SAMPLES=OFF -DENABLE_COMPILE_TOOL=OFF, opencv"
-PACKAGECONFIG[verbose] = "-DVERBOSE_BUILD=1,-DVERBOSE_BUILD=0"
-
-do_configure:prepend() {
-    # Dont set PROJECT_ROOT_DIR
-    sed -i -e 's:\${OpenVINO_SOURCE_DIR}::;' ${S}/src/CMakeLists.txt
-
-    # qemu wrapper that can be used by cmake to run target binaries.
-    qemu_binary="${@qemu_wrapper_cmdline(d, d.getVar('STAGING_DIR_HOST'), [d.expand('${STAGING_DIR_HOST}${libdir}'),d.expand('${STAGING_DIR_HOST}${base_libdir}')])}"
-    cat > ${WORKDIR}/qemuwrapper << EOF
-#!/bin/sh
-$qemu_binary "\$@"
-EOF
-    chmod +x ${WORKDIR}/qemuwrapper
-}
-
-do_install:append() {
-    rm -rf ${D}${prefix}/install_dependencies
-    rm -rf ${D}${prefix}/setupvars.sh
-
-    find ${B}/src/plugins/intel_cpu/cross-compiled/ -type f -name *_disp.cpp -exec sed -i -e 's%'"${S}"'%'"${TARGET_DBGSRC_DIR}"'%g' {} +
-}
-
-# Otherwise e.g. ros-openvino-toolkit-dynamic-vino-sample when using dldt-inference-engine uses dldt-inference-engine WORKDIR
-# instead of RSS
-SSTATE_SCAN_FILES:append = " *.cmake"
-
-FILES:${PN} += "\
-                ${libdir}/openvino-${PV}/lib*${SOLIBSDEV} \
-                ${libdir}/openvino-${PV}/plugins.xml \
-                ${libdir}/openvino-${PV}/cache.json \
-                "
-
-# Move inference engine samples into a separate package
-PACKAGES =+ "${PN}-samples"
-
-FILES:${PN}-samples = "${datadir}/openvino \
-                       ${bindir} \
-                       ${libdir}/libformat_reader.a \
-                       ${libdir}/libopencv_c_wrapper.a \
-                       "
-
-RDEPENDS:${PN}-samples += "python3-core"
-
-# Package for inference engine python API
-PACKAGES =+ "${PN}-python3"
-
-FILES:${PN}-python3 = "${PYTHON_SITEPACKAGES_DIR}"
-
-UPSTREAM_CHECK_GITTAGREGEX = "(?P<pver>(\d+\.\d+\.\d+))$"
diff --git a/lib/oeqa/runtime/cases/dldt_inference_engine.py b/lib/oeqa/runtime/cases/dldt_inference_engine.py
deleted file mode 100644
index fb35d52f..00000000
--- a/lib/oeqa/runtime/cases/dldt_inference_engine.py
+++ /dev/null
@@ -1,109 +0,0 @@
-from oeqa.runtime.case import OERuntimeTestCase
-from oeqa.runtime.decorator.package import OEHasPackage
-from oeqa.core.decorator.depends import OETestDepends
-from oeqa.runtime.miutils.targets.oeqatarget import OEQATarget
-from oeqa.runtime.miutils.tests.squeezenet_model_download_test import SqueezenetModelDownloadTest
-from oeqa.runtime.miutils.tests.dldt_model_optimizer_test import DldtModelOptimizerTest
-from oeqa.runtime.miutils.tests.dldt_inference_engine_test import DldtInferenceEngineTest
-from oeqa.runtime.miutils.dldtutils import get_testdata_config
-
-class DldtInferenceEngine(OERuntimeTestCase):
-
-    @classmethod
-    def setUpClass(cls):
-        cls.sqn_download = SqueezenetModelDownloadTest(OEQATarget(cls.tc.target), '/tmp/ie/md')
-        cls.sqn_download.setup()
-        cls.dldt_mo = DldtModelOptimizerTest(OEQATarget(cls.tc.target), '/tmp/ie/ir')
-        cls.dldt_mo.setup()
-        cls.dldt_ie = DldtInferenceEngineTest(OEQATarget(cls.tc.target), '/tmp/ie/inputs')
-        cls.dldt_ie.setup()
-        cls.ir_files_dir = cls.dldt_mo.work_dir
-
-    @classmethod
-    def tearDownClass(cls):
-        cls.dldt_ie.tear_down()
-        cls.dldt_mo.tear_down()
-        cls.sqn_download.tear_down()
-
-    @OEHasPackage(['dldt-model-optimizer'])
-    @OEHasPackage(['wget'])
-    def test_dldt_ie_can_create_ir_and_download_input(self):
-        proxy_port = get_testdata_config(self.tc.td, 'DLDT_PIP_PROXY')
-        if not proxy_port:
-            self.skipTest('Need to configure bitbake configuration (DLDT_PIP_PROXY="proxy.server:port").')
-        (status, output) = self.sqn_download.test_can_download_squeezenet_model(proxy_port)
-        self.assertEqual(status, 0, msg='status and output: %s and %s' % (status, output))
-        (status, output) = self.sqn_download.test_can_download_squeezenet_prototxt(proxy_port)
-        self.assertEqual(status, 0, msg='status and output: %s and %s' % (status, output))
-
-        mo_exe_dir = get_testdata_config(self.tc.td, 'DLDT_MO_EXE_DIR')
-        if not mo_exe_dir:
-            self.skipTest('Need to configure bitbake configuration (DLDT_MO_EXE_DIR="directory_to_mo.py").')
-        mo_files_dir = self.sqn_download.work_dir
-        (status, output) = self.dldt_mo.test_dldt_mo_can_create_ir(mo_exe_dir, mo_files_dir)
-        self.assertEqual(status, 0, msg='status and output: %s and %s' % (status, output))
-
-        (status, output) = self.dldt_ie.test_can_download_input_file(proxy_port)
-        self.assertEqual(status, 0, msg='status and output: %s and %s' % (status, output))
-
-    @OETestDepends(['dldt_inference_engine.DldtInferenceEngine.test_dldt_ie_can_create_ir_and_download_input'])
-    @OEHasPackage(['dldt-inference-engine'])
-    @OEHasPackage(['dldt-inference-engine-samples'])
-    def test_dldt_ie_classification_with_cpu(self):
-        (status, output) = self.dldt_ie.test_dldt_ie_classification_with_device('CPU', self.ir_files_dir)
-        self.assertEqual(status, 0, msg='status and output: %s and %s' % (status, output))
-
-    @OETestDepends(['dldt_inference_engine.DldtInferenceEngine.test_dldt_ie_can_create_ir_and_download_input'])
-    @OEHasPackage(['dldt-inference-engine'])
-    @OEHasPackage(['dldt-inference-engine-samples'])
-    @OEHasPackage(['intel-compute-runtime'])
-    @OEHasPackage(['ocl-icd'])
-    def test_dldt_ie_classification_with_gpu(self):
-        (status, output) = 
self.dldt_ie.test_dldt_ie_classification_with_device('GPU', self.ir_files_dir)
-        self.assertEqual(status, 0, msg='status and output: %s and %s' % 
(status, output))
-
-    
@OETestDepends(['dldt_inference_engine.DldtInferenceEngine.test_dldt_ie_can_create_ir_and_download_input'])
-    @OEHasPackage(['dldt-inference-engine'])
-    @OEHasPackage(['dldt-inference-engine-samples'])
-    @OEHasPackage(['dldt-inference-engine-vpu-firmware'])
-    def test_dldt_ie_classification_with_myriad(self):
-        device = 'MYRIAD'
-        (status, output) = 
self.dldt_ie.test_check_if_openvino_device_available(device)
-        if not status:
-            self.skipTest('OpenVINO %s device not available on target 
machine(availalbe devices: %s)' % (device, output))
-        (status, output) = 
self.dldt_ie.test_dldt_ie_classification_with_device(device, self.ir_files_dir)
-        self.assertEqual(status, 0, msg='status and output: %s and %s' % 
(status, output))
-
-    
@OETestDepends(['dldt_inference_engine.DldtInferenceEngine.test_dldt_ie_can_create_ir_and_download_input'])
-    @OEHasPackage(['dldt-inference-engine'])
-    @OEHasPackage(['dldt-inference-engine-python3'])
-    @OEHasPackage(['python3-opencv'])
-    @OEHasPackage(['python3-numpy'])
-    def test_dldt_ie_classification_python_api_with_cpu(self):
-        (status, output) = 
self.dldt_ie.test_dldt_ie_classification_python_api_with_device('CPU', 
self.ir_files_dir)
-        self.assertEqual(status, 0, msg='status and output: %s and %s' % 
(status, output))
-
-    
@OETestDepends(['dldt_inference_engine.DldtInferenceEngine.test_dldt_ie_can_create_ir_and_download_input'])
-    @OEHasPackage(['dldt-inference-engine'])
-    @OEHasPackage(['dldt-inference-engine-python3'])
-    @OEHasPackage(['intel-compute-runtime'])
-    @OEHasPackage(['ocl-icd'])
-    @OEHasPackage(['python3-opencv'])
-    @OEHasPackage(['python3-numpy'])
-    def test_dldt_ie_classification_python_api_with_gpu(self):
-        (status, output) = 
self.dldt_ie.test_dldt_ie_classification_python_api_with_device('GPU', 
self.ir_files_dir)
-        self.assertEqual(status, 0, msg='status and output: %s and %s' % 
(status, output))
-
-    
@OETestDepends(['dldt_inference_engine.DldtInferenceEngine.test_dldt_ie_can_create_ir_and_download_input'])
-    @OEHasPackage(['dldt-inference-engine'])
-    @OEHasPackage(['dldt-inference-engine-python3'])
-    @OEHasPackage(['dldt-inference-engine-vpu-firmware'])
-    @OEHasPackage(['python3-opencv'])
-    @OEHasPackage(['python3-numpy'])
-    def test_dldt_ie_classification_python_api_with_myriad(self):
-        device = 'MYRIAD'
-        (status, output) = 
self.dldt_ie.test_check_if_openvino_device_available(device)
-        if not status:
-            self.skipTest('OpenVINO %s device not available on target 
machine(availalbe devices: %s)' % (device, output))
-        (status, output) = 
self.dldt_ie.test_dldt_ie_classification_python_api_with_device(device, 
self.ir_files_dir)
-        self.assertEqual(status, 0, msg='status and output: %s and %s' % 
(status, output))
diff --git a/lib/oeqa/runtime/cases/dldt_model_optimizer.py b/lib/oeqa/runtime/cases/dldt_model_optimizer.py
deleted file mode 100644
index 736ea661..00000000
--- a/lib/oeqa/runtime/cases/dldt_model_optimizer.py
+++ /dev/null
@@ -1,38 +0,0 @@
-from oeqa.runtime.case import OERuntimeTestCase
-from oeqa.runtime.decorator.package import OEHasPackage
-from oeqa.runtime.miutils.targets.oeqatarget import OEQATarget
-from oeqa.runtime.miutils.tests.squeezenet_model_download_test import SqueezenetModelDownloadTest
-from oeqa.runtime.miutils.tests.dldt_model_optimizer_test import DldtModelOptimizerTest
-from oeqa.runtime.miutils.dldtutils import get_testdata_config
-
-class DldtModelOptimizer(OERuntimeTestCase):
-
-    @classmethod
-    def setUpClass(cls):
-        cls.sqn_download = SqueezenetModelDownloadTest(OEQATarget(cls.tc.target), '/tmp/mo/md')
-        cls.sqn_download.setup()
-        cls.dldt_mo = DldtModelOptimizerTest(OEQATarget(cls.tc.target), '/tmp/mo/ir')
-        cls.dldt_mo.setup()
-
-    @classmethod
-    def tearDownClass(cls):
-        cls.dldt_mo.tear_down()
-        cls.sqn_download.tear_down()
-
-    @OEHasPackage(['dldt-model-optimizer'])
-    @OEHasPackage(['wget'])
-    def test_dldt_mo_can_create_ir(self):
-        proxy_port = get_testdata_config(self.tc.td, 'DLDT_PIP_PROXY')
-        if not proxy_port:
-            self.skipTest('Need to configure bitbake configuration (DLDT_PIP_PROXY="proxy.server:port").')
-        (status, output) = self.sqn_download.test_can_download_squeezenet_model(proxy_port)
-        self.assertEqual(status, 0, msg='status and output: %s and %s' % (status, output))
-        (status, output) = self.sqn_download.test_can_download_squeezenet_prototxt(proxy_port)
-        self.assertEqual(status, 0, msg='status and output: %s and %s' % (status, output))
-
-        mo_exe_dir = get_testdata_config(self.tc.td, 'DLDT_MO_EXE_DIR')
-        if not mo_exe_dir:
-            self.skipTest('Need to configure bitbake configuration (DLDT_MO_EXE_DIR="directory_to_mo.py").')
-        mo_files_dir = self.sqn_download.work_dir
-        (status, output) = self.dldt_mo.test_dldt_mo_can_create_ir(mo_exe_dir, mo_files_dir)
-        self.assertEqual(status, 0, msg='status and output: %s and %s' % (status, output))
diff --git a/lib/oeqa/runtime/files/dldt-inference-engine/classification_sample.py b/lib/oeqa/runtime/files/dldt-inference-engine/classification_sample.py
deleted file mode 100644
index 1906e9fe..00000000
--- a/lib/oeqa/runtime/files/dldt-inference-engine/classification_sample.py
+++ /dev/null
@@ -1,135 +0,0 @@
-#!/usr/bin/env python3
-"""
- Copyright (C) 2018-2019 Intel Corporation
-
- Licensed under the Apache License, Version 2.0 (the "License");
- you may not use this file except in compliance with the License.
- You may obtain a copy of the License at
-
-      http://www.apache.org/licenses/LICENSE-2.0
-
- Unless required by applicable law or agreed to in writing, software
- distributed under the License is distributed on an "AS IS" BASIS,
- WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
- See the License for the specific language governing permissions and
- limitations under the License.
-"""
-from __future__ import print_function
-import sys
-import os
-from argparse import ArgumentParser, SUPPRESS
-import cv2
-import numpy as np
-import logging as log
-from time import time
-from openvino.inference_engine import IENetwork, IECore
-
-
-def build_argparser():
-    parser = ArgumentParser(add_help=False)
-    args = parser.add_argument_group('Options')
-    args.add_argument('-h', '--help', action='help', default=SUPPRESS, help='Show this help message and exit.')
-    args.add_argument("-m", "--model", help="Required. Path to an .xml file with a trained model.", required=True,
-                      type=str)
-    args.add_argument("-i", "--input", help="Required. Path to a folder with images or path to an image files",
-                      required=True,
-                      type=str, nargs="+")
-    args.add_argument("-l", "--cpu_extension",
-                      help="Optional. Required for CPU custom layers. "
-                           "MKLDNN (CPU)-targeted custom layers. Absolute path to a shared library with the"
-                           " kernels implementations.", type=str, default=None)
-    args.add_argument("-d", "--device",
-                      help="Optional. Specify the target device to infer on; CPU, GPU, FPGA, HDDL, MYRIAD or HETERO: is "
-                           "acceptable. The sample will look for a suitable plugin for device specified. Default "
-                           "value is CPU",
-                      default="CPU", type=str)
-    args.add_argument("--labels", help="Optional. Path to a labels mapping file", default=None, type=str)
-    args.add_argument("-nt", "--number_top", help="Optional. Number of top results", default=10, type=int)
-
-    return parser
-
-
-def main():
-    log.basicConfig(format="[ %(levelname)s ] %(message)s", level=log.INFO, stream=sys.stdout)
-    args = build_argparser().parse_args()
-    model_xml = args.model
-    model_bin = os.path.splitext(model_xml)[0] + ".bin"
-
-    # Plugin initialization for specified device and load extensions library if specified
-    log.info("Creating Inference Engine")
-    ie = IECore()
-    if args.cpu_extension and 'CPU' in args.device:
-        ie.add_extension(args.cpu_extension, "CPU")
-    # Read IR
-    log.info("Loading network files:\n\t{}\n\t{}".format(model_xml, model_bin))
-    net = IENetwork(model=model_xml, weights=model_bin)
-
-    if "CPU" in args.device:
-        supported_layers = ie.query_network(net, "CPU")
-        not_supported_layers = [l for l in net.layers.keys() if l not in supported_layers]
-        if len(not_supported_layers) != 0:
-            log.error("Following layers are not supported by the plugin for specified device {}:\n {}".
-                      format(args.device, ', '.join(not_supported_layers)))
-            log.error("Please try to specify cpu extensions library path in sample's command line parameters using -l "
-                      "or --cpu_extension command line argument")
-            sys.exit(1)
-
-    assert len(net.inputs.keys()) == 1, "Sample supports only single input topologies"
-    assert len(net.outputs) == 1, "Sample supports only single output topologies"
-
-    log.info("Preparing input blobs")
-    input_blob = next(iter(net.inputs))
-    out_blob = next(iter(net.outputs))
-    net.batch_size = len(args.input)
-
-    # Read and pre-process input images
-    n, c, h, w = net.inputs[input_blob].shape
-    images = np.ndarray(shape=(n, c, h, w))
-    for i in range(n):
-        image = cv2.imread(args.input[i])
-        if image.shape[:-1] != (h, w):
-            log.warning("Image {} is resized from {} to {}".format(args.input[i], image.shape[:-1], (h, w)))
-            image = cv2.resize(image, (w, h))
-        image = image.transpose((2, 0, 1))  # Change data layout from HWC to CHW
-        images[i] = image
-    log.info("Batch size is {}".format(n))
-
-    # Loading model to the plugin
-    log.info("Loading model to the plugin")
-    exec_net = ie.load_network(network=net, device_name=args.device)
-
-    # Start sync inference
-    log.info("Starting inference in synchronous mode")
-    res = exec_net.infer(inputs={input_blob: images})
-
-    # Processing output blob
-    log.info("Processing output blob")
-    res = res[out_blob]
-    log.info("Top {} results: ".format(args.number_top))
-    if args.labels:
-        with open(args.labels, 'r') as f:
-            labels_map = [x.split(sep=' ', maxsplit=1)[-1].strip() for x in f]
-    else:
-        labels_map = None
-    classid_str = "classid"
-    probability_str = "probability"
-    for i, probs in enumerate(res):
-        probs = np.squeeze(probs)
-        top_ind = np.argsort(probs)[-args.number_top:][::-1]
-        print("Image {}\n".format(args.input[i]))
-        print(classid_str, probability_str)
-        print("{} {}".format('-' * len(classid_str), '-' * len(probability_str)))
-        for id in top_ind:
-            det_label = labels_map[id] if labels_map else "{}".format(id)
-            label_length = len(det_label)
-            space_num_before = (len(classid_str) - label_length) // 2
-            space_num_after = len(classid_str) - (space_num_before + label_length) + 2
-            space_num_before_prob = (len(probability_str) - len(str(probs[id]))) // 2
-            print("{}{}{}{}{:.7f}".format(' ' * space_num_before, det_label,
-                                          ' ' * space_num_after, ' ' * space_num_before_prob,
-                                          probs[id]))
-        print("\n")
-    log.info("This sample is an API example, for any performance measurements please use the dedicated benchmark_app tool\n")
-
-if __name__ == '__main__':
-    sys.exit(main() or 0)
diff --git a/lib/oeqa/runtime/miutils/dldtutils.py b/lib/oeqa/runtime/miutils/dldtutils.py
deleted file mode 100644
index 45bf2e12..00000000
--- a/lib/oeqa/runtime/miutils/dldtutils.py
+++ /dev/null
@@ -1,3 +0,0 @@
-
-def get_testdata_config(testdata, config):
-    return testdata.get(config)
diff --git a/lib/oeqa/runtime/miutils/tests/dldt_inference_engine_test.py b/lib/oeqa/runtime/miutils/tests/dldt_inference_engine_test.py
deleted file mode 100644
index 31bfb539..00000000
--- a/lib/oeqa/runtime/miutils/tests/dldt_inference_engine_test.py
+++ /dev/null
@@ -1,56 +0,0 @@
-import os
-script_path = os.path.dirname(os.path.realpath(__file__))
-files_path = os.path.join(script_path, '../../files/')
-
-class DldtInferenceEngineTest(object):
-    ie_input_files = {'ie_python_sample': 'classification_sample.py',
-                      'input': 'chicky_512.png',
-                      'input_download': 'https://raw.githubusercontent.com/opencv/opencv/master/samples/data/chicky_512.png',
-                      'model': 'squeezenet_v1.1.xml'}
-
-    def __init__(self, target, work_dir):
-        self.target = target
-        self.work_dir = work_dir
-
-    def setup(self):
-        self.target.run('mkdir -p %s' % self.work_dir)
-        self.target.copy_to(os.path.join(files_path, 'dldt-inference-engine', self.ie_input_files['ie_python_sample']),
-                            self.work_dir)
-        python_cmd = 'from openvino.inference_engine import IENetwork, IECore; ie = IECore(); print(ie.available_devices)'
-        __, output = self.target.run('python3 -c "%s"' % python_cmd)
-        self.available_devices = output
-
-    def tear_down(self):
-        self.target.run('rm -rf %s' % self.work_dir)
-
-    def test_check_if_openvino_device_available(self, device):
-        if device not in self.available_devices:
-            return False, self.available_devices
-        return True, self.available_devices
-
-    def test_can_download_input_file(self, proxy_port):
-        return self.target.run('cd %s; wget %s -e https_proxy=%s' %
-                               (self.work_dir,
-                                self.ie_input_files['input_download'],
-                                proxy_port))
-
-    def test_dldt_ie_classification_with_device(self, device, ir_files_dir):
-        return self.target.run('classification_sample_async -d %s -i %s -m %s' %
-                               (device,
-                                os.path.join(self.work_dir, self.ie_input_files['input']),
-                                os.path.join(ir_files_dir, self.ie_input_files['model'])))
-
-    def test_dldt_ie_classification_python_api_with_device(self, device, ir_files_dir, extension=''):
-        if extension:
-            return self.target.run('python3 %s -d %s -i %s -m %s -l %s' %
-                                   (os.path.join(self.work_dir, self.ie_input_files['ie_python_sample']),
-                                    device,
-                                    os.path.join(self.work_dir, self.ie_input_files['input']),
-                                    os.path.join(ir_files_dir, self.ie_input_files['model']),
-                                    extension))
-        else:
-            return self.target.run('python3 %s -d %s -i %s -m %s' %
-                                   (os.path.join(self.work_dir, self.ie_input_files['ie_python_sample']),
-                                    device,
-                                    os.path.join(self.work_dir, self.ie_input_files['input']),
-                                    os.path.join(ir_files_dir, self.ie_input_files['model'])))
diff --git a/lib/oeqa/runtime/miutils/tests/dldt_model_optimizer_test.py b/lib/oeqa/runtime/miutils/tests/dldt_model_optimizer_test.py
deleted file mode 100644
index 7d3db15b..00000000
--- a/lib/oeqa/runtime/miutils/tests/dldt_model_optimizer_test.py
+++ /dev/null
@@ -1,23 +0,0 @@
-import os
-
-class DldtModelOptimizerTest(object):
-    mo_input_files = {'model': 'squeezenet_v1.1.caffemodel',
-                      'prototxt': 'deploy.prototxt'}
-    mo_exe = 'mo.py'
-
-    def __init__(self, target, work_dir):
-        self.target = target
-        self.work_dir = work_dir
-
-    def setup(self):
-        self.target.run('mkdir -p %s' % self.work_dir)
-
-    def tear_down(self):
-        self.target.run('rm -rf %s' % self.work_dir)
-
-    def test_dldt_mo_can_create_ir(self, mo_exe_dir, mo_files_dir):
-        return self.target.run('python3 %s --input_model %s --input_proto %s --output_dir %s --data_type FP16' %
-                               (os.path.join(mo_exe_dir, self.mo_exe),
-                                os.path.join(mo_files_dir, self.mo_input_files['model']),
-                                os.path.join(mo_files_dir, self.mo_input_files['prototxt']),
-                                self.work_dir))
diff --git a/lib/oeqa/runtime/miutils/tests/squeezenet_model_download_test.py b/lib/oeqa/runtime/miutils/tests/squeezenet_model_download_test.py
deleted file mode 100644
index a3e46a0a..00000000
--- a/lib/oeqa/runtime/miutils/tests/squeezenet_model_download_test.py
+++ /dev/null
@@ -1,25 +0,0 @@
-class SqueezenetModelDownloadTest(object):
-    download_files = {'squeezenet1.1.prototxt': 'https://raw.githubusercontent.com/DeepScale/SqueezeNet/a47b6f13d30985279789d08053d37013d67d131b/SqueezeNet_v1.1/deploy.prototxt',
-                      'squeezenet1.1.caffemodel': 'https://github.com/DeepScale/SqueezeNet/raw/a47b6f13d30985279789d08053d37013d67d131b/SqueezeNet_v1.1/squeezenet_v1.1.caffemodel'}
-
-    def __init__(self, target, work_dir):
-        self.target = target
-        self.work_dir = work_dir
-
-    def setup(self):
-        self.target.run('mkdir -p %s' % self.work_dir)
-
-    def tear_down(self):
-        self.target.run('rm -rf %s' % self.work_dir)
-
-    def test_can_download_squeezenet_model(self, proxy_port):
-        return self.target.run('cd %s; wget %s -e https_proxy=%s' %
-                               (self.work_dir,
-                                self.download_files['squeezenet1.1.caffemodel'],
-                                proxy_port))
-
-    def test_can_download_squeezenet_prototxt(self, proxy_port):
-        return self.target.run('cd %s; wget %s -e https_proxy=%s' %
-                               (self.work_dir,
-                                self.download_files['squeezenet1.1.prototxt'],
-                                proxy_port))
-- 
2.34.1
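
Note for builders: once this series is applied, images that still need OpenVINO should pull the recipes from the standalone meta-openvino layer linked in the commit message. A minimal, illustrative bblayers.conf fragment — the checkout location is a placeholder assumption, not a path taken from this patch:

```conf
# conf/bblayers.conf — illustrative fragment; clone
# https://github.com/intel/meta-openvino next to your other layers and
# point BBLAYERS at it (the path below is a placeholder).
BBLAYERS += "/path/to/meta-openvino"
```

Equivalently, `bitbake-layers add-layer` can register the checkout from an initialized build environment.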

-=-=-=-=-=-=-=-=-=-=-=-
Links: You receive all messages sent to this group.
View/Reply Online (#8315): 
https://lists.yoctoproject.org/g/meta-intel/message/8315
Mute This Topic: https://lists.yoctoproject.org/mt/105974008/21656
Group Owner: [email protected]
Unsubscribe: https://lists.yoctoproject.org/g/meta-intel/unsub 
[[email protected]]
-=-=-=-=-=-=-=-=-=-=-=-
