{"id":23948,"date":"2020-02-10T15:33:42","date_gmt":"2020-02-10T23:33:42","guid":{"rendered":"https:\/\/www.edge-ai-vision.com\/?page_id=23948"},"modified":"2023-08-24T13:18:16","modified_gmt":"2023-08-24T20:18:16","slug":"resources","status":"publish","type":"page","link":"https:\/\/www.edge-ai-vision.com\/resources\/","title":{"rendered":"Resources"},"content":{"rendered":"\t\t<div data-elementor-type=\"wp-page\" data-elementor-id=\"23948\" class=\"elementor elementor-23948\" data-elementor-post-type=\"page\">\n\t\t\t\t\t\t<section data-particle_enable=\"false\" data-particle-mobile-disabled=\"false\" class=\"elementor-section elementor-top-section elementor-element elementor-element-1510a60f elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"1510a60f\" data-element_type=\"section\">\n\t\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-default\">\n\t\t\t\t\t<div class=\"elementor-column elementor-col-100 elementor-top-column elementor-element elementor-element-7fda3c1d\" data-id=\"7fda3c1d\" data-element_type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-25b7da04 elementor-widget elementor-widget-heading\" data-id=\"25b7da04\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Resources<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-5296b278 elementor-widget elementor-widget-heading\" data-id=\"5296b278\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h4 class=\"elementor-heading-title elementor-size-default\">In-depth information about edge AI and vision applications, technologies, products, markets and 
trends.<\/h4>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-278f66b4 elementor-widget elementor-widget-text-editor\" data-id=\"278f66b4\" data-element_type=\"widget\" data-widget_type=\"text-editor.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<p>The content in this section of the website comes from Edge AI and Vision Alliance members and other industry luminaries.<\/p>\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/section>\n\t\t\t\t<section data-particle_enable=\"false\" data-particle-mobile-disabled=\"false\" class=\"elementor-section elementor-top-section elementor-element elementor-element-323ed039 elementor-section-stretched elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"323ed039\" data-element_type=\"section\" data-settings=\"{&quot;background_background&quot;:&quot;classic&quot;,&quot;shape_divider_top&quot;:&quot;opacity-tilt&quot;,&quot;shape_divider_bottom&quot;:&quot;opacity-tilt&quot;,&quot;stretch_section&quot;:&quot;section-stretched&quot;}\">\n\t\t\t\t\t\t\t<div class=\"elementor-background-overlay\"><\/div>\n\t\t\t\t\t\t<div class=\"elementor-shape elementor-shape-top\" aria-hidden=\"true\" data-negative=\"false\">\n\t\t\t<svg xmlns=\"http:\/\/www.w3.org\/2000\/svg\" viewBox=\"0 0 2600 131.1\" preserveAspectRatio=\"none\">\n\t<path class=\"elementor-shape-fill\" d=\"M0 0L2600 0 2600 69.1 0 0z\"\/>\n\t<path class=\"elementor-shape-fill\" style=\"opacity:0.5\" d=\"M0 0L2600 0 2600 69.1 0 69.1z\"\/>\n\t<path class=\"elementor-shape-fill\" style=\"opacity:0.25\" d=\"M2600 0L0 0 0 130.1 2600 69.1z\"\/>\n<\/svg>\t\t<\/div>\n\t\t\t\t<div class=\"elementor-shape elementor-shape-bottom\" aria-hidden=\"true\" data-negative=\"false\">\n\t\t\t<svg xmlns=\"http:\/\/www.w3.org\/2000\/svg\" viewBox=\"0 0 2600 131.1\" preserveAspectRatio=\"none\">\n\t<path 
class=\"elementor-shape-fill\" d=\"M0 0L2600 0 2600 69.1 0 0z\"\/>\n\t<path class=\"elementor-shape-fill\" style=\"opacity:0.5\" d=\"M0 0L2600 0 2600 69.1 0 69.1z\"\/>\n\t<path class=\"elementor-shape-fill\" style=\"opacity:0.25\" d=\"M2600 0L0 0 0 130.1 2600 69.1z\"\/>\n<\/svg>\t\t<\/div>\n\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-wider\">\n\t\t\t\t\t<div class=\"elementor-column elementor-col-33 elementor-top-column elementor-element elementor-element-5c35ed9b\" data-id=\"5c35ed9b\" data-element_type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-75e48ca7 joinNowBtn elementor-widget elementor-widget-button\" data-id=\"75e48ca7\" data-element_type=\"widget\" data-widget_type=\"button.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<div class=\"elementor-button-wrapper\">\n\t\t\t\t\t<a class=\"elementor-button elementor-button-link elementor-size-xl\" href=\"\/resources\/technologies\">\n\t\t\t\t\t\t<span class=\"elementor-button-content-wrapper\">\n\t\t\t\t\t\t\t\t\t<span class=\"elementor-button-text\">TECHNOLOGIES<\/span>\n\t\t\t\t\t<\/span>\n\t\t\t\t\t<\/a>\n\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t<div class=\"elementor-column elementor-col-33 elementor-top-column elementor-element elementor-element-385fe108\" data-id=\"385fe108\" data-element_type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-78ce9b9 joinNowBtn elementor-widget elementor-widget-button\" data-id=\"78ce9b9\" data-element_type=\"widget\" data-widget_type=\"button.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<div class=\"elementor-button-wrapper\">\n\t\t\t\t\t<a class=\"elementor-button elementor-button-link elementor-size-xl\" 
href=\"\/resources\/applications\">\n\t\t\t\t\t\t<span class=\"elementor-button-content-wrapper\">\n\t\t\t\t\t\t\t\t\t<span class=\"elementor-button-text\">APPLICATIONS<\/span>\n\t\t\t\t\t<\/span>\n\t\t\t\t\t<\/a>\n\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t<div class=\"elementor-column elementor-col-33 elementor-top-column elementor-element elementor-element-2b50914a\" data-id=\"2b50914a\" data-element_type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-6741225b joinNowBtn elementor-widget elementor-widget-button\" data-id=\"6741225b\" data-element_type=\"widget\" data-widget_type=\"button.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<div class=\"elementor-button-wrapper\">\n\t\t\t\t\t<a class=\"elementor-button elementor-button-link elementor-size-xl\" href=\"\/resources\/functions\">\n\t\t\t\t\t\t<span class=\"elementor-button-content-wrapper\">\n\t\t\t\t\t\t\t\t\t<span class=\"elementor-button-text\">FUNCTIONS<\/span>\n\t\t\t\t\t<\/span>\n\t\t\t\t\t<\/a>\n\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/section>\n\t\t\t\t<section data-particle_enable=\"false\" data-particle-mobile-disabled=\"false\" class=\"elementor-section elementor-top-section elementor-element elementor-element-51f5aee9 elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"51f5aee9\" data-element_type=\"section\">\n\t\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-wider\">\n\t\t\t\t\t<div class=\"elementor-column elementor-col-100 elementor-top-column elementor-element elementor-element-4bed7d73\" data-id=\"4bed7d73\" data-element_type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element 
elementor-element-3ee500aa elementor-widget elementor-widget-heading\" data-id=\"3ee500aa\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">All Resources<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-709cbd49 elementor-widget-divider--view-line elementor-widget elementor-widget-divider\" data-id=\"709cbd49\" data-element_type=\"widget\" data-widget_type=\"divider.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<div class=\"elementor-divider\">\n\t\t\t<span class=\"elementor-divider-separator\">\n\t\t\t\t\t\t<\/span>\n\t\t<\/div>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-37cbe5f0 elementor-grid-3 elementor-grid-tablet-2 elementor-grid-mobile-1 elementor-posts--thumbnail-top elementor-widget elementor-widget-posts\" data-id=\"37cbe5f0\" data-element_type=\"widget\" data-settings=\"{&quot;pagination_type&quot;:&quot;numbers_and_prev_next&quot;,&quot;classic_columns&quot;:&quot;3&quot;,&quot;classic_columns_tablet&quot;:&quot;2&quot;,&quot;classic_columns_mobile&quot;:&quot;1&quot;,&quot;classic_row_gap&quot;:{&quot;unit&quot;:&quot;px&quot;,&quot;size&quot;:35,&quot;sizes&quot;:[]},&quot;classic_row_gap_tablet&quot;:{&quot;unit&quot;:&quot;px&quot;,&quot;size&quot;:&quot;&quot;,&quot;sizes&quot;:[]},&quot;classic_row_gap_mobile&quot;:{&quot;unit&quot;:&quot;px&quot;,&quot;size&quot;:&quot;&quot;,&quot;sizes&quot;:[]}}\" data-widget_type=\"posts.classic\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<div class=\"elementor-posts-container elementor-posts elementor-posts--skin-classic elementor-grid\" role=\"list\">\n\t\t\t\t<article class=\"elementor-post elementor-grid-item post-56832 post type-post status-publish format-standard hentry category-memory category-videos\" 
role=\"listitem\">\n\t\t\t\t<div class=\"elementor-post__text\">\n\t\t\t\t<h3 class=\"elementor-post__title\">\n\t\t\t<a href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/january-2026-dram-market-update\/\" >\n\t\t\t\tJanuary 2026 DRAM Market Update\t\t\t<\/a>\n\t\t<\/h3>\n\t\t\t\t<div class=\"elementor-post__meta-data\">\n\t\t\t\t\t<span class=\"elementor-post-date\">\n\t\t\tFebruary 14, 2026\t\t<\/span>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-post__excerpt\">\n\t\t\t\t\t<\/div>\n\t\t\n\t\t<a class=\"elementor-post__read-more\" href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/january-2026-dram-market-update\/\" aria-label=\"Read more about January 2026 DRAM Market Update\" tabindex=\"-1\" >\n\t\t\tRead More +\t\t<\/a>\n\n\t\t\t\t<\/div>\n\t\t\t\t<\/article>\n\t\t\t\t<article class=\"elementor-post elementor-grid-item post-56804 post type-post status-publish format-standard has-post-thumbnail hentry category-blog category-e-con-systems category-sensors category-sony-electronics\" role=\"listitem\">\n\t\t\t\t<a class=\"elementor-post__thumbnail__link\" href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/sony-pregius-imx264-vs-imx568-a-detailed-sensor-comparison-guide\/\" tabindex=\"-1\" >\n\t\t\t<div class=\"elementor-post__thumbnail\"><img fetchpriority=\"high\" decoding=\"async\" width=\"300\" height=\"200\" src=\"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/02\/Sony-Pregius-IMX264-vs.-IMX568-A-Detailed-Sensor-Comparison-Guide-300x200.jpg\" class=\"attachment-medium size-medium wp-image-56805\" alt=\"\" srcset=\"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/02\/Sony-Pregius-IMX264-vs.-IMX568-A-Detailed-Sensor-Comparison-Guide-300x200.jpg 300w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/02\/Sony-Pregius-IMX264-vs.-IMX568-A-Detailed-Sensor-Comparison-Guide-1024x683.jpg 1024w, 
https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/02\/Sony-Pregius-IMX264-vs.-IMX568-A-Detailed-Sensor-Comparison-Guide-768x512.jpg 768w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/02\/Sony-Pregius-IMX264-vs.-IMX568-A-Detailed-Sensor-Comparison-Guide.jpg 1050w\" sizes=\"(max-width: 300px) 100vw, 300px\" \/><\/div>\n\t\t<\/a>\n\t\t\t\t<div class=\"elementor-post__text\">\n\t\t\t\t<h3 class=\"elementor-post__title\">\n\t\t\t<a href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/sony-pregius-imx264-vs-imx568-a-detailed-sensor-comparison-guide\/\" >\n\t\t\t\tSony Pregius IMX264 vs. IMX568: A Detailed Sensor Comparison Guide\t\t\t<\/a>\n\t\t<\/h3>\n\t\t\t\t<div class=\"elementor-post__meta-data\">\n\t\t\t\t\t<span class=\"elementor-post-date\">\n\t\t\tFebruary 13, 2026\t\t<\/span>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-post__excerpt\">\n\t\t\t<p>This blog post was originally published at\u00a0e-con Systems\u2019 website. It is reprinted here with the permission of e-con Systems. The image sensor is an important component in defining the camera\u2019s image quality. Many real-world applications<\/p>\n\t\t<\/div>\n\t\t\n\t\t<a class=\"elementor-post__read-more\" href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/sony-pregius-imx264-vs-imx568-a-detailed-sensor-comparison-guide\/\" aria-label=\"Read more about Sony Pregius IMX264 vs. 
IMX568: A Detailed Sensor Comparison Guide\" tabindex=\"-1\" >\n\t\t\tRead More +\t\t<\/a>\n\n\t\t\t\t<\/div>\n\t\t\t\t<\/article>\n\t\t\t\t<article class=\"elementor-post elementor-grid-item post-56801 post type-post status-publish format-standard has-post-thumbnail hentry category-blog category-industrial-vision-computer-vision category-lincode\" role=\"listitem\">\n\t\t\t\t<a class=\"elementor-post__thumbnail__link\" href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/what-happens-when-the-inspection-ai-fails-learning-from-production-line-mistakes\/\" tabindex=\"-1\" >\n\t\t\t<div class=\"elementor-post__thumbnail\"><img decoding=\"async\" width=\"300\" height=\"157\" src=\"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/02\/What-Happens-When-the-Inspection-AI-Fails-Learning-from-Production-Line-Mistakes-300x157.png\" class=\"attachment-medium size-medium wp-image-56802\" alt=\"\" srcset=\"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/02\/What-Happens-When-the-Inspection-AI-Fails-Learning-from-Production-Line-Mistakes-300x157.png 300w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/02\/What-Happens-When-the-Inspection-AI-Fails-Learning-from-Production-Line-Mistakes-1024x536.png 1024w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/02\/What-Happens-When-the-Inspection-AI-Fails-Learning-from-Production-Line-Mistakes-768x402.png 768w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/02\/What-Happens-When-the-Inspection-AI-Fails-Learning-from-Production-Line-Mistakes.png 1200w\" sizes=\"(max-width: 300px) 100vw, 300px\" \/><\/div>\n\t\t<\/a>\n\t\t\t\t<div class=\"elementor-post__text\">\n\t\t\t\t<h3 class=\"elementor-post__title\">\n\t\t\t<a href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/what-happens-when-the-inspection-ai-fails-learning-from-production-line-mistakes\/\" >\n\t\t\t\tWhat Happens When the Inspection AI Fails: Learning from Production Line 
Mistakes\t\t\t<\/a>\n\t\t<\/h3>\n\t\t\t\t<div class=\"elementor-post__meta-data\">\n\t\t\t\t\t<span class=\"elementor-post-date\">\n\t\t\tFebruary 12, 2026\t\t<\/span>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-post__excerpt\">\n\t\t\t<p>This blog post was originally published at\u00a0Lincode\u2019s website. It is reprinted here with the permission of Lincode. Studies show that\u00a0about 34% of manufacturing defects are missed because inspection systems make mistakes.[1] These numbers show a<\/p>\n\t\t<\/div>\n\t\t\n\t\t<a class=\"elementor-post__read-more\" href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/what-happens-when-the-inspection-ai-fails-learning-from-production-line-mistakes\/\" aria-label=\"Read more about What Happens When the Inspection AI Fails: Learning from Production Line Mistakes\" tabindex=\"-1\" >\n\t\t\tRead More +\t\t<\/a>\n\n\t\t\t\t<\/div>\n\t\t\t\t<\/article>\n\t\t\t\t<article class=\"elementor-post elementor-grid-item post-56822 post type-post status-publish format-standard has-post-thumbnail hentry category-mipi-alliance category-news category-sensors category-tools\" role=\"listitem\">\n\t\t\t\t<a class=\"elementor-post__thumbnail__link\" href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/upcoming-webinar-on-csi-2-over-d-phy-c-phy\/\" tabindex=\"-1\" >\n\t\t\t<div class=\"elementor-post__thumbnail\"><img decoding=\"async\" width=\"300\" height=\"200\" src=\"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/02\/Webinar-CSI-2-Over-C-D-PHY-Feb2026-Website-LI-NoCTA-300x200.png\" class=\"attachment-medium size-medium wp-image-56823\" alt=\"\" srcset=\"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/02\/Webinar-CSI-2-Over-C-D-PHY-Feb2026-Website-LI-NoCTA-300x200.png 300w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/02\/Webinar-CSI-2-Over-C-D-PHY-Feb2026-Website-LI-NoCTA-768x512.png 768w, 
https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/02\/Webinar-CSI-2-Over-C-D-PHY-Feb2026-Website-LI-NoCTA.png 1000w\" sizes=\"(max-width: 300px) 100vw, 300px\" \/><\/div>\n\t\t<\/a>\n\t\t\t\t<div class=\"elementor-post__text\">\n\t\t\t\t<h3 class=\"elementor-post__title\">\n\t\t\t<a href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/upcoming-webinar-on-csi-2-over-d-phy-c-phy\/\" >\n\t\t\t\tUpcoming Webinar on CSI-2 over D-PHY &#038; C-PHY\t\t\t<\/a>\n\t\t<\/h3>\n\t\t\t\t<div class=\"elementor-post__meta-data\">\n\t\t\t\t\t<span class=\"elementor-post-date\">\n\t\t\tFebruary 11, 2026\t\t<\/span>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-post__excerpt\">\n\t\t\t<p>On February 24, 2026, at 9:00 am PST (12:00 pm EST) MIPI Alliance will deliver a webinar \u201cMIPI CSI-2 over D-PHY &amp; C-PHY: Advancing Imaging Conduit Solutions\u201d From the event page: MIPI CSI-2\u00ae, together with<\/p>\n\t\t<\/div>\n\t\t\n\t\t<a class=\"elementor-post__read-more\" href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/upcoming-webinar-on-csi-2-over-d-phy-c-phy\/\" aria-label=\"Read more about Upcoming Webinar on CSI-2 over D-PHY &#038; C-PHY\" tabindex=\"-1\" >\n\t\t\tRead More +\t\t<\/a>\n\n\t\t\t\t<\/div>\n\t\t\t\t<\/article>\n\t\t\t\t<article class=\"elementor-post elementor-grid-item post-56797 post type-post status-publish format-standard has-post-thumbnail hentry category-blog category-mipi-alliance category-sensors category-tools\" role=\"listitem\">\n\t\t\t\t<a class=\"elementor-post__thumbnail__link\" href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/whats-new-in-mipi-security-mipi-ccise-and-security-for-debug\/\" tabindex=\"-1\" >\n\t\t\t<div class=\"elementor-post__thumbnail\"><img loading=\"lazy\" decoding=\"async\" width=\"300\" height=\"200\" src=\"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/02\/Security-Recap-2025-Blog5-300x200.png\" class=\"attachment-medium size-medium wp-image-56798\" alt=\"\" 
srcset=\"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/02\/Security-Recap-2025-Blog5-300x200.png 300w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/02\/Security-Recap-2025-Blog5-768x512.png 768w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/02\/Security-Recap-2025-Blog5.png 1000w\" sizes=\"(max-width: 300px) 100vw, 300px\" \/><\/div>\n\t\t<\/a>\n\t\t\t\t<div class=\"elementor-post__text\">\n\t\t\t\t<h3 class=\"elementor-post__title\">\n\t\t\t<a href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/whats-new-in-mipi-security-mipi-ccise-and-security-for-debug\/\" >\n\t\t\t\tWhat\u2019s New in MIPI Security: MIPI CCISE and Security for Debug\t\t\t<\/a>\n\t\t<\/h3>\n\t\t\t\t<div class=\"elementor-post__meta-data\">\n\t\t\t\t\t<span class=\"elementor-post-date\">\n\t\t\tFebruary 11, 2026\t\t<\/span>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-post__excerpt\">\n\t\t\t<p>This blog post was originally published at\u00a0MIPI Alliance\u2019s website. It is reprinted here with the permission of MIPI Alliance. 
As the need for security becomes increasingly more critical, MIPI Alliance has continued to broaden its<\/p>\n\t\t<\/div>\n\t\t\n\t\t<a class=\"elementor-post__read-more\" href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/whats-new-in-mipi-security-mipi-ccise-and-security-for-debug\/\" aria-label=\"Read more about What\u2019s New in MIPI Security: MIPI CCISE and Security for Debug\" tabindex=\"-1\" >\n\t\t\tRead More +\t\t<\/a>\n\n\t\t\t\t<\/div>\n\t\t\t\t<\/article>\n\t\t\t\t<article class=\"elementor-post elementor-grid-item post-56811 post type-post status-publish format-standard has-post-thumbnail hentry category-microchip-technology category-news category-software category-tools\" role=\"listitem\">\n\t\t\t\t<a class=\"elementor-post__thumbnail__link\" href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/production-ready-full-stack-edge-ai-solutions-turn-microchips-mcus-and-mpus-into-catalysts-for-intelligent-real-time-decision-making\/\" tabindex=\"-1\" >\n\t\t\t<div class=\"elementor-post__thumbnail\"><img loading=\"lazy\" decoding=\"async\" width=\"300\" height=\"167\" src=\"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/02\/55062918660_4b00866c20_o-300x167.jpg\" class=\"attachment-medium size-medium wp-image-56812\" alt=\"\" srcset=\"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/02\/55062918660_4b00866c20_o-300x167.jpg 300w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/02\/55062918660_4b00866c20_o-1024x569.jpg 1024w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/02\/55062918660_4b00866c20_o-768x427.jpg 768w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/02\/55062918660_4b00866c20_o-1536x853.jpg 1536w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/02\/55062918660_4b00866c20_o-2048x1138.jpg 2048w\" sizes=\"(max-width: 300px) 100vw, 300px\" \/><\/div>\n\t\t<\/a>\n\t\t\t\t<div class=\"elementor-post__text\">\n\t\t\t\t<h3 class=\"elementor-post__title\">\n\t\t\t<a 
href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/production-ready-full-stack-edge-ai-solutions-turn-microchips-mcus-and-mpus-into-catalysts-for-intelligent-real-time-decision-making\/\" >\n\t\t\t\tProduction-Ready, Full-Stack Edge AI Solutions Turn Microchip\u2019s MCUs and MPUs Into Catalysts for Intelligent Real-Time Decision-Making\t\t\t<\/a>\n\t\t<\/h3>\n\t\t\t\t<div class=\"elementor-post__meta-data\">\n\t\t\t\t\t<span class=\"elementor-post-date\">\n\t\t\tFebruary 10, 2026\t\t<\/span>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-post__excerpt\">\n\t\t\t<p>Chandler, Ariz., February 10, 2026 \u2014\u00a0A major next step for artificial intelligence (AI) and machine learning (ML) innovation is moving ML models from the cloud to the edge for real-time inferencing and decision-making applications in<\/p>\n\t\t<\/div>\n\t\t\n\t\t<a class=\"elementor-post__read-more\" href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/production-ready-full-stack-edge-ai-solutions-turn-microchips-mcus-and-mpus-into-catalysts-for-intelligent-real-time-decision-making\/\" aria-label=\"Read more about Production-Ready, Full-Stack Edge AI Solutions Turn Microchip\u2019s MCUs and MPUs Into Catalysts for Intelligent Real-Time Decision-Making\" tabindex=\"-1\" >\n\t\t\tRead More +\t\t<\/a>\n\n\t\t\t\t<\/div>\n\t\t\t\t<\/article>\n\t\t\t\t<article class=\"elementor-post elementor-grid-item post-56795 post type-post status-publish format-standard has-post-thumbnail hentry category-automotive category-blog category-software category-texas-instruments category-tools\" role=\"listitem\">\n\t\t\t\t<a class=\"elementor-post__thumbnail__link\" href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/accelerating-next-generation-automotive-designs-with-the-tda5-virtualizer-development-kit\/\" tabindex=\"-1\" >\n\t\t\t<div class=\"elementor-post__thumbnail\"><img loading=\"lazy\" decoding=\"async\" width=\"300\" height=\"168\" 
src=\"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2020\/01\/logoheader_texasinstruments-300x168.jpg\" class=\"attachment-medium size-medium wp-image-12399\" alt=\"\" srcset=\"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2020\/01\/logoheader_texasinstruments-300x168.jpg 300w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2020\/01\/logoheader_texasinstruments-1024x576.jpg 1024w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2020\/01\/logoheader_texasinstruments-768x432.jpg 768w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2020\/01\/logoheader_texasinstruments-1536x864.jpg 1536w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2020\/01\/logoheader_texasinstruments-2048x1152.jpg 2048w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2020\/01\/logoheader_texasinstruments-500x281.jpg 500w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2020\/01\/logoheader_texasinstruments.jpg 1200w\" sizes=\"(max-width: 300px) 100vw, 300px\" \/><\/div>\n\t\t<\/a>\n\t\t\t\t<div class=\"elementor-post__text\">\n\t\t\t\t<h3 class=\"elementor-post__title\">\n\t\t\t<a href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/accelerating-next-generation-automotive-designs-with-the-tda5-virtualizer-development-kit\/\" >\n\t\t\t\tAccelerating next-generation automotive designs with the TDA5 Virtualizer\u2122 Development Kit\t\t\t<\/a>\n\t\t<\/h3>\n\t\t\t\t<div class=\"elementor-post__meta-data\">\n\t\t\t\t\t<span class=\"elementor-post-date\">\n\t\t\tFebruary 10, 2026\t\t<\/span>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-post__excerpt\">\n\t\t\t<p>This blog post was originally published at\u00a0Texas Instruments\u2019 website. It is reprinted here with the permission of Texas Instruments. 
Introduction Continuous innovation in high-performance, power-efficient systems-on-a-chip (SoCs) is enabling safer, smarter and more autonomous driving<\/p>\n\t\t<\/div>\n\t\t\n\t\t<a class=\"elementor-post__read-more\" href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/accelerating-next-generation-automotive-designs-with-the-tda5-virtualizer-development-kit\/\" aria-label=\"Read more about Accelerating next-generation automotive designs with the TDA5 Virtualizer\u2122 Development Kit\" tabindex=\"-1\" >\n\t\t\tRead More +\t\t<\/a>\n\n\t\t\t\t<\/div>\n\t\t\t\t<\/article>\n\t\t\t\t<article class=\"elementor-post elementor-grid-item post-56608 post type-post status-publish format-standard has-post-thumbnail hentry category-automotive category-blog category-nvidia category-robotics category-software category-tools\" role=\"listitem\">\n\t\t\t\t<a class=\"elementor-post__thumbnail__link\" href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/into-the-omniverse-openusd-and-nvidia-halos-accelerate-safety-for-robotaxis-physical-ai-systems\/\" tabindex=\"-1\" >\n\t\t\t<div class=\"elementor-post__thumbnail\"><img loading=\"lazy\" decoding=\"async\" width=\"300\" height=\"159\" src=\"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/01\/dec-ito-open-usd-halos-robotaxi-physical-ai-300x159.png\" class=\"attachment-medium size-medium wp-image-56609\" alt=\"\" srcset=\"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/01\/dec-ito-open-usd-halos-robotaxi-physical-ai-300x159.png 300w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/01\/dec-ito-open-usd-halos-robotaxi-physical-ai-1024x544.png 1024w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/01\/dec-ito-open-usd-halos-robotaxi-physical-ai-768x408.png 768w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/01\/dec-ito-open-usd-halos-robotaxi-physical-ai.png 1280w\" sizes=\"(max-width: 300px) 100vw, 300px\" \/><\/div>\n\t\t<\/a>\n\t\t\t\t<div 
class=\"elementor-post__text\">\n\t\t\t\t<h3 class=\"elementor-post__title\">\n\t\t\t<a href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/into-the-omniverse-openusd-and-nvidia-halos-accelerate-safety-for-robotaxis-physical-ai-systems\/\" >\n\t\t\t\tInto the Omniverse: OpenUSD and NVIDIA Halos Accelerate Safety for Robotaxis, Physical AI Systems\t\t\t<\/a>\n\t\t<\/h3>\n\t\t\t\t<div class=\"elementor-post__meta-data\">\n\t\t\t\t\t<span class=\"elementor-post-date\">\n\t\t\tFebruary 9, 2026\t\t<\/span>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-post__excerpt\">\n\t\t\t<p>This blog post was originally published at NVIDIA\u2019s website. It is reprinted here with the permission of NVIDIA. NVIDIA Editor\u2019s note: This post is part of Into the Omniverse, a series focused on how developers,<\/p>\n\t\t<\/div>\n\t\t\n\t\t<a class=\"elementor-post__read-more\" href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/into-the-omniverse-openusd-and-nvidia-halos-accelerate-safety-for-robotaxis-physical-ai-systems\/\" aria-label=\"Read more about Into the Omniverse: OpenUSD and NVIDIA Halos Accelerate Safety for Robotaxis, Physical AI Systems\" tabindex=\"-1\" >\n\t\t\tRead More +\t\t<\/a>\n\n\t\t\t\t<\/div>\n\t\t\t\t<\/article>\n\t\t\t\t<article class=\"elementor-post elementor-grid-item post-56689 post type-post status-publish format-standard has-post-thumbnail hentry category-automotive category-blog category-e-con-systems category-nvidia category-robotics category-sensors\" role=\"listitem\">\n\t\t\t\t<a class=\"elementor-post__thumbnail__link\" href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/what-sensor-fusion-architecture-offers-for-nvidia-orin-nx-based-autonomous-vision-systems\/\" tabindex=\"-1\" >\n\t\t\t<div class=\"elementor-post__thumbnail\"><img loading=\"lazy\" decoding=\"async\" width=\"300\" height=\"200\" 
src=\"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/01\/What-Sensor-Fusion-Architecture-Offers-for-NVIDIA-Orin-NX-Based-Autonomous-Vision-Systems-300x200.jpg\" class=\"attachment-medium size-medium wp-image-56690\" alt=\"\" srcset=\"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/01\/What-Sensor-Fusion-Architecture-Offers-for-NVIDIA-Orin-NX-Based-Autonomous-Vision-Systems-300x200.jpg 300w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/01\/What-Sensor-Fusion-Architecture-Offers-for-NVIDIA-Orin-NX-Based-Autonomous-Vision-Systems-1024x683.jpg 1024w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/01\/What-Sensor-Fusion-Architecture-Offers-for-NVIDIA-Orin-NX-Based-Autonomous-Vision-Systems-768x512.jpg 768w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/01\/What-Sensor-Fusion-Architecture-Offers-for-NVIDIA-Orin-NX-Based-Autonomous-Vision-Systems.jpg 1050w\" sizes=\"(max-width: 300px) 100vw, 300px\" \/><\/div>\n\t\t<\/a>\n\t\t\t\t<div class=\"elementor-post__text\">\n\t\t\t\t<h3 class=\"elementor-post__title\">\n\t\t\t<a href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/what-sensor-fusion-architecture-offers-for-nvidia-orin-nx-based-autonomous-vision-systems\/\" >\n\t\t\t\tWhat Sensor Fusion Architecture Offers for NVIDIA Orin NX-Based Autonomous Vision Systems\t\t\t<\/a>\n\t\t<\/h3>\n\t\t\t\t<div class=\"elementor-post__meta-data\">\n\t\t\t\t\t<span class=\"elementor-post-date\">\n\t\t\tFebruary 6, 2026\t\t<\/span>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-post__excerpt\">\n\t\t\t<p>This blog post was originally published at\u00a0e-con Systems\u2019 website. It is reprinted here with the permission of e-con Systems. 
Key Takeaways Why multi-sensor timing drift weakens edge AI perception How GNSS-disciplined clocks align cameras, LiDAR,<\/p>\n\t\t<\/div>\n\t\t\n\t\t<a class=\"elementor-post__read-more\" href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/what-sensor-fusion-architecture-offers-for-nvidia-orin-nx-based-autonomous-vision-systems\/\" aria-label=\"Read more about What Sensor Fusion Architecture Offers for NVIDIA Orin NX-Based Autonomous Vision Systems\" tabindex=\"-1\" >\n\t\t\tRead More +\t\t<\/a>\n\n\t\t\t\t<\/div>\n\t\t\t\t<\/article>\n\t\t\t\t<\/div>\n\t\t\n\t\t\t\t<div class=\"e-load-more-anchor\" data-page=\"1\" data-max-page=\"505\" data-next-page=\"https:\/\/www.edge-ai-vision.com\/resources\/2\/\"><\/div>\n\t\t\t\t<nav class=\"elementor-pagination\" aria-label=\"Pagination\">\n\t\t\t<span class=\"page-numbers prev\">&laquo; Previous<\/span>\n<span aria-current=\"page\" class=\"page-numbers current\"><span class=\"elementor-screen-only\">Page<\/span>1<\/span>\n<a class=\"page-numbers\" href=\"https:\/\/www.edge-ai-vision.com\/resources\/2\/\"><span class=\"elementor-screen-only\">Page<\/span>2<\/a>\n<a class=\"page-numbers\" href=\"https:\/\/www.edge-ai-vision.com\/resources\/3\/\"><span class=\"elementor-screen-only\">Page<\/span>3<\/a>\n<a class=\"page-numbers\" href=\"https:\/\/www.edge-ai-vision.com\/resources\/4\/\"><span class=\"elementor-screen-only\">Page<\/span>4<\/a>\n<a class=\"page-numbers\" href=\"https:\/\/www.edge-ai-vision.com\/resources\/5\/\"><span class=\"elementor-screen-only\">Page<\/span>5<\/a>\n<a class=\"page-numbers next\" href=\"https:\/\/www.edge-ai-vision.com\/resources\/2\/\">Next &raquo;<\/a>\t\t<\/nav>\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/section>\n\t\t\t\t<section data-particle_enable=\"false\" data-particle-mobile-disabled=\"false\" class=\"elementor-section elementor-top-section elementor-element elementor-element-13f59482 elementor-section-stretched 
elementor-section-full_width elementor-section-height-default elementor-section-height-default\" data-id=\"13f59482\" data-element_type=\"section\" data-settings=\"{&quot;background_background&quot;:&quot;classic&quot;,&quot;shape_divider_top&quot;:&quot;opacity-tilt&quot;,&quot;shape_divider_bottom&quot;:&quot;opacity-tilt&quot;,&quot;stretch_section&quot;:&quot;section-stretched&quot;}\">\n\t\t\t\t\t\t\t<div class=\"elementor-background-overlay\"><\/div>\n\t\t\t\t\t\t<div class=\"elementor-shape elementor-shape-top\" aria-hidden=\"true\" data-negative=\"false\">\n\t\t\t<svg xmlns=\"http:\/\/www.w3.org\/2000\/svg\" viewBox=\"0 0 2600 131.1\" preserveAspectRatio=\"none\">\n\t<path class=\"elementor-shape-fill\" d=\"M0 0L2600 0 2600 69.1 0 0z\"\/>\n\t<path class=\"elementor-shape-fill\" style=\"opacity:0.5\" d=\"M0 0L2600 0 2600 69.1 0 69.1z\"\/>\n\t<path class=\"elementor-shape-fill\" style=\"opacity:0.25\" d=\"M2600 0L0 0 0 130.1 2600 69.1z\"\/>\n<\/svg>\t\t<\/div>\n\t\t\t\t<div class=\"elementor-shape elementor-shape-bottom\" aria-hidden=\"true\" data-negative=\"false\">\n\t\t\t<svg xmlns=\"http:\/\/www.w3.org\/2000\/svg\" viewBox=\"0 0 2600 131.1\" preserveAspectRatio=\"none\">\n\t<path class=\"elementor-shape-fill\" d=\"M0 0L2600 0 2600 69.1 0 0z\"\/>\n\t<path class=\"elementor-shape-fill\" style=\"opacity:0.5\" d=\"M0 0L2600 0 2600 69.1 0 69.1z\"\/>\n\t<path class=\"elementor-shape-fill\" style=\"opacity:0.25\" d=\"M2600 0L0 0 0 130.1 2600 69.1z\"\/>\n<\/svg>\t\t<\/div>\n\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-default\">\n\t\t\t\t\t<div class=\"elementor-column elementor-col-33 elementor-top-column elementor-element elementor-element-1261903e\" data-id=\"1261903e\" data-element_type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap\">\n\t\t\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t<div class=\"elementor-column elementor-col-33 elementor-top-column elementor-element elementor-element-41ae555b\" data-id=\"41ae555b\" 
data-element_type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap\">\n\t\t\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t<div class=\"elementor-column elementor-col-33 elementor-top-column elementor-element elementor-element-78e0b78c\" data-id=\"78e0b78c\" data-element_type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap\">\n\t\t\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/section>\n\t\t\t\t<section data-particle_enable=\"false\" data-particle-mobile-disabled=\"false\" class=\"elementor-section elementor-top-section elementor-element elementor-element-26c8e6cf elementor-section-boxed elementor-section-height-default elementor-section-height-default\" data-id=\"26c8e6cf\" data-element_type=\"section\">\n\t\t\t\t\t\t<div class=\"elementor-container elementor-column-gap-wider\">\n\t\t\t\t\t<div class=\"elementor-column elementor-col-33 elementor-top-column elementor-element elementor-element-262da10e\" data-id=\"262da10e\" data-element_type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-312c2123 elementor-widget elementor-widget-heading\" data-id=\"312c2123\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Technologies<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-22635b13 joinNowBtn elementor-widget elementor-widget-button\" data-id=\"22635b13\" data-element_type=\"widget\" data-widget_type=\"button.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<div class=\"elementor-button-wrapper\">\n\t\t\t\t\t<a class=\"elementor-button elementor-button-link elementor-size-xl\" href=\"\/resources\/technologies\">\n\t\t\t\t\t\t<span class=\"elementor-button-content-wrapper\">\n\t\t\t\t\t\t\t\t\t<span class=\"elementor-button-text\">VIEW 
ALL<\/span>\n\t\t\t\t\t<\/span>\n\t\t\t\t\t<\/a>\n\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-2141101d elementor-grid-1 elementor-grid-tablet-2 elementor-grid-mobile-1 elementor-posts--thumbnail-top elementor-posts--show-avatar elementor-card-shadow-yes elementor-posts__hover-gradient elementor-widget elementor-widget-posts\" data-id=\"2141101d\" data-element_type=\"widget\" data-settings=\"{&quot;cards_columns&quot;:&quot;1&quot;,&quot;cards_columns_tablet&quot;:&quot;2&quot;,&quot;cards_columns_mobile&quot;:&quot;1&quot;,&quot;cards_row_gap&quot;:{&quot;unit&quot;:&quot;px&quot;,&quot;size&quot;:35,&quot;sizes&quot;:[]},&quot;cards_row_gap_tablet&quot;:{&quot;unit&quot;:&quot;px&quot;,&quot;size&quot;:&quot;&quot;,&quot;sizes&quot;:[]},&quot;cards_row_gap_mobile&quot;:{&quot;unit&quot;:&quot;px&quot;,&quot;size&quot;:&quot;&quot;,&quot;sizes&quot;:[]}}\" data-widget_type=\"posts.cards\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<div class=\"elementor-posts-container elementor-posts elementor-posts--skin-cards elementor-grid\" role=\"list\">\n\t\t\t\t<article class=\"elementor-post elementor-grid-item post-56832 post type-post status-publish format-standard hentry category-memory category-videos\" role=\"listitem\">\n\t\t\t<div class=\"elementor-post__card\">\n\t\t\t\t<div class=\"elementor-post__text\">\n\t\t\t\t<h3 class=\"elementor-post__title\">\n\t\t\t<a href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/january-2026-dram-market-update\/\" >\n\t\t\t\tJanuary 2026 DRAM Market Update\t\t\t<\/a>\n\t\t<\/h3>\n\t\t\t\t<div class=\"elementor-post__excerpt\">\n\t\t\t\t\t<\/div>\n\t\t\n\t\t<a class=\"elementor-post__read-more\" href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/january-2026-dram-market-update\/\" aria-label=\"Read more about January 2026 DRAM Market Update\" tabindex=\"-1\" >\n\t\t\tRead More +\t\t<\/a>\n\n\t\t\t\t<\/div>\n\t\t\t\t<div 
class=\"elementor-post__meta-data\">\n\t\t\t\t\t<span class=\"elementor-post-date\">\n\t\t\tFebruary 14, 2026\t\t<\/span>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/article>\n\t\t\t\t<article class=\"elementor-post elementor-grid-item post-56804 post type-post status-publish format-standard has-post-thumbnail hentry category-blog category-e-con-systems category-sensors category-sony-electronics\" role=\"listitem\">\n\t\t\t<div class=\"elementor-post__card\">\n\t\t\t\t<a class=\"elementor-post__thumbnail__link\" href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/sony-pregius-imx264-vs-imx568-a-detailed-sensor-comparison-guide\/\" tabindex=\"-1\" ><div class=\"elementor-post__thumbnail\"><img fetchpriority=\"high\" decoding=\"async\" width=\"300\" height=\"200\" src=\"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/02\/Sony-Pregius-IMX264-vs.-IMX568-A-Detailed-Sensor-Comparison-Guide-300x200.jpg\" class=\"attachment-medium size-medium wp-image-56805\" alt=\"\" srcset=\"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/02\/Sony-Pregius-IMX264-vs.-IMX568-A-Detailed-Sensor-Comparison-Guide-300x200.jpg 300w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/02\/Sony-Pregius-IMX264-vs.-IMX568-A-Detailed-Sensor-Comparison-Guide-1024x683.jpg 1024w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/02\/Sony-Pregius-IMX264-vs.-IMX568-A-Detailed-Sensor-Comparison-Guide-768x512.jpg 768w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/02\/Sony-Pregius-IMX264-vs.-IMX568-A-Detailed-Sensor-Comparison-Guide.jpg 1050w\" sizes=\"(max-width: 300px) 100vw, 300px\" \/><\/div><\/a>\n\t\t\t\t<div class=\"elementor-post__badge\">Blog Posts<\/div>\n\t\t\t\t<div class=\"elementor-post__avatar\">\n\t\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-post__text\">\n\t\t\t\t<h3 class=\"elementor-post__title\">\n\t\t\t<a href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/sony-pregius-imx264-vs-imx568-a-detailed-sensor-comparison-guide\/\" 
>\n\t\t\t\tSony Pregius IMX264 vs. IMX568: A Detailed Sensor Comparison Guide\t\t\t<\/a>\n\t\t<\/h3>\n\t\t\t\t<div class=\"elementor-post__excerpt\">\n\t\t\t<p>This blog post was originally published at\u00a0e-con Systems\u2019 website. It is reprinted here with the permission of e-con Systems. The image sensor is an important component in defining the camera\u2019s image quality. Many real-world applications pushed for smaller pixel sizes to increase resolution in compact form factors. \u00a0To address this demand, Sony has been improving<\/p>\n\t\t<\/div>\n\t\t\n\t\t<a class=\"elementor-post__read-more\" href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/sony-pregius-imx264-vs-imx568-a-detailed-sensor-comparison-guide\/\" aria-label=\"Read more about Sony Pregius IMX264 vs. IMX568: A Detailed Sensor Comparison Guide\" tabindex=\"-1\" >\n\t\t\tRead More +\t\t<\/a>\n\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-post__meta-data\">\n\t\t\t\t\t<span class=\"elementor-post-date\">\n\t\t\tFebruary 13, 2026\t\t<\/span>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/article>\n\t\t\t\t<article class=\"elementor-post elementor-grid-item post-56822 post type-post status-publish format-standard has-post-thumbnail hentry category-mipi-alliance category-news category-sensors category-tools\" role=\"listitem\">\n\t\t\t<div class=\"elementor-post__card\">\n\t\t\t\t<a class=\"elementor-post__thumbnail__link\" href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/upcoming-webinar-on-csi-2-over-d-phy-c-phy\/\" tabindex=\"-1\" ><div class=\"elementor-post__thumbnail\"><img decoding=\"async\" width=\"300\" height=\"200\" src=\"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/02\/Webinar-CSI-2-Over-C-D-PHY-Feb2026-Website-LI-NoCTA-300x200.png\" class=\"attachment-medium size-medium wp-image-56823\" alt=\"\" srcset=\"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/02\/Webinar-CSI-2-Over-C-D-PHY-Feb2026-Website-LI-NoCTA-300x200.png 300w, 
https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/02\/Webinar-CSI-2-Over-C-D-PHY-Feb2026-Website-LI-NoCTA-768x512.png 768w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/02\/Webinar-CSI-2-Over-C-D-PHY-Feb2026-Website-LI-NoCTA.png 1000w\" sizes=\"(max-width: 300px) 100vw, 300px\" \/><\/div><\/a>\n\t\t\t\t<div class=\"elementor-post__badge\">MIPI Alliance<\/div>\n\t\t\t\t<div class=\"elementor-post__avatar\">\n\t\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-post__text\">\n\t\t\t\t<h3 class=\"elementor-post__title\">\n\t\t\t<a href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/upcoming-webinar-on-csi-2-over-d-phy-c-phy\/\" >\n\t\t\t\tUpcoming Webinar on CSI-2 over D-PHY &#038; C-PHY\t\t\t<\/a>\n\t\t<\/h3>\n\t\t\t\t<div class=\"elementor-post__excerpt\">\n\t\t\t<p>On February 24, 2026, at 9:00 am PST (12:00 pm EST) MIPI Alliance will deliver a webinar \u201cMIPI CSI-2 over D-PHY &amp; C-PHY: Advancing Imaging Conduit Solutions\u201d From the event page: MIPI CSI-2\u00ae, together with MIPI D-PHY\u2122 and C-PHY\u2122 physical layers, form the foundation of image sensor solutions across a wide range of markets, including<\/p>\n\t\t<\/div>\n\t\t\n\t\t<a class=\"elementor-post__read-more\" href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/upcoming-webinar-on-csi-2-over-d-phy-c-phy\/\" aria-label=\"Read more about Upcoming Webinar on CSI-2 over D-PHY &#038; C-PHY\" tabindex=\"-1\" >\n\t\t\tRead More +\t\t<\/a>\n\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-post__meta-data\">\n\t\t\t\t\t<span class=\"elementor-post-date\">\n\t\t\tFebruary 11, 2026\t\t<\/span>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/article>\n\t\t\t\t<\/div>\n\t\t\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t<div class=\"elementor-column elementor-col-33 elementor-top-column elementor-element elementor-element-6e2a641b\" data-id=\"6e2a641b\" data-element_type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap 
elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-641ebac9 elementor-widget elementor-widget-heading\" data-id=\"641ebac9\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Applications<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-71b4af40 joinNowBtn elementor-widget elementor-widget-button\" data-id=\"71b4af40\" data-element_type=\"widget\" data-widget_type=\"button.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<div class=\"elementor-button-wrapper\">\n\t\t\t\t\t<a class=\"elementor-button elementor-button-link elementor-size-xl\" href=\"\/resources\/applications\">\n\t\t\t\t\t\t<span class=\"elementor-button-content-wrapper\">\n\t\t\t\t\t\t\t\t\t<span class=\"elementor-button-text\">VIEW ALL<\/span>\n\t\t\t\t\t<\/span>\n\t\t\t\t\t<\/a>\n\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-142c7dd9 elementor-grid-1 elementor-grid-tablet-2 elementor-grid-mobile-1 elementor-posts--thumbnail-top elementor-posts--show-avatar elementor-card-shadow-yes elementor-posts__hover-gradient elementor-widget elementor-widget-posts\" data-id=\"142c7dd9\" data-element_type=\"widget\" data-settings=\"{&quot;cards_columns&quot;:&quot;1&quot;,&quot;cards_columns_tablet&quot;:&quot;2&quot;,&quot;cards_columns_mobile&quot;:&quot;1&quot;,&quot;cards_row_gap&quot;:{&quot;unit&quot;:&quot;px&quot;,&quot;size&quot;:35,&quot;sizes&quot;:[]},&quot;cards_row_gap_tablet&quot;:{&quot;unit&quot;:&quot;px&quot;,&quot;size&quot;:&quot;&quot;,&quot;sizes&quot;:[]},&quot;cards_row_gap_mobile&quot;:{&quot;unit&quot;:&quot;px&quot;,&quot;size&quot;:&quot;&quot;,&quot;sizes&quot;:[]}}\" data-widget_type=\"posts.cards\">\n\t\t\t\t<div 
class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<div class=\"elementor-posts-container elementor-posts elementor-posts--skin-cards elementor-grid\" role=\"list\">\n\t\t\t\t<article class=\"elementor-post elementor-grid-item post-56801 post type-post status-publish format-standard has-post-thumbnail hentry category-blog category-industrial-vision-computer-vision category-lincode\" role=\"listitem\">\n\t\t\t<div class=\"elementor-post__card\">\n\t\t\t\t<a class=\"elementor-post__thumbnail__link\" href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/what-happens-when-the-inspection-ai-fails-learning-from-production-line-mistakes\/\" tabindex=\"-1\" ><div class=\"elementor-post__thumbnail\"><img decoding=\"async\" width=\"300\" height=\"157\" src=\"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/02\/What-Happens-When-the-Inspection-AI-Fails-Learning-from-Production-Line-Mistakes-300x157.png\" class=\"attachment-medium size-medium wp-image-56802\" alt=\"\" srcset=\"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/02\/What-Happens-When-the-Inspection-AI-Fails-Learning-from-Production-Line-Mistakes-300x157.png 300w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/02\/What-Happens-When-the-Inspection-AI-Fails-Learning-from-Production-Line-Mistakes-1024x536.png 1024w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/02\/What-Happens-When-the-Inspection-AI-Fails-Learning-from-Production-Line-Mistakes-768x402.png 768w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/02\/What-Happens-When-the-Inspection-AI-Fails-Learning-from-Production-Line-Mistakes.png 1200w\" sizes=\"(max-width: 300px) 100vw, 300px\" \/><\/div><\/a>\n\t\t\t\t<div class=\"elementor-post__badge\">Blog Posts<\/div>\n\t\t\t\t<div class=\"elementor-post__avatar\">\n\t\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-post__text\">\n\t\t\t\t<h3 class=\"elementor-post__title\">\n\t\t\t<a 
href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/what-happens-when-the-inspection-ai-fails-learning-from-production-line-mistakes\/\" >\n\t\t\t\tWhat Happens When the Inspection AI Fails: Learning from Production Line Mistakes\t\t\t<\/a>\n\t\t<\/h3>\n\t\t\t\t<div class=\"elementor-post__excerpt\">\n\t\t\t<p>This blog post was originally published at\u00a0Lincode\u2019s website. It is reprinted here with the permission of Lincode. Studies show that\u00a0about 34% of manufacturing defects are missed because inspection systems make mistakes.[1] These numbers show a big problem\u2014when the inspection AI misses something, even a tiny defect can spread across hundreds or thousands of products. One<\/p>\n\t\t<\/div>\n\t\t\n\t\t<a class=\"elementor-post__read-more\" href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/what-happens-when-the-inspection-ai-fails-learning-from-production-line-mistakes\/\" aria-label=\"Read more about What Happens When the Inspection AI Fails: Learning from Production Line Mistakes\" tabindex=\"-1\" >\n\t\t\tRead More +\t\t<\/a>\n\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-post__meta-data\">\n\t\t\t\t\t<span class=\"elementor-post-date\">\n\t\t\tFebruary 12, 2026\t\t<\/span>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/article>\n\t\t\t\t<article class=\"elementor-post elementor-grid-item post-56795 post type-post status-publish format-standard has-post-thumbnail hentry category-automotive category-blog category-software category-texas-instruments category-tools\" role=\"listitem\">\n\t\t\t<div class=\"elementor-post__card\">\n\t\t\t\t<a class=\"elementor-post__thumbnail__link\" href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/accelerating-next-generation-automotive-designs-with-the-tda5-virtualizer-development-kit\/\" tabindex=\"-1\" ><div class=\"elementor-post__thumbnail\"><img loading=\"lazy\" decoding=\"async\" width=\"300\" height=\"168\" 
src=\"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2020\/01\/logoheader_texasinstruments-300x168.jpg\" class=\"attachment-medium size-medium wp-image-12399\" alt=\"\" srcset=\"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2020\/01\/logoheader_texasinstruments-300x168.jpg 300w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2020\/01\/logoheader_texasinstruments-1024x576.jpg 1024w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2020\/01\/logoheader_texasinstruments-768x432.jpg 768w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2020\/01\/logoheader_texasinstruments-1536x864.jpg 1536w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2020\/01\/logoheader_texasinstruments-2048x1152.jpg 2048w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2020\/01\/logoheader_texasinstruments-500x281.jpg 500w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2020\/01\/logoheader_texasinstruments.jpg 1200w\" sizes=\"(max-width: 300px) 100vw, 300px\" \/><\/div><\/a>\n\t\t\t\t<div class=\"elementor-post__badge\">Automotive<\/div>\n\t\t\t\t<div class=\"elementor-post__avatar\">\n\t\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-post__text\">\n\t\t\t\t<h3 class=\"elementor-post__title\">\n\t\t\t<a href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/accelerating-next-generation-automotive-designs-with-the-tda5-virtualizer-development-kit\/\" >\n\t\t\t\tAccelerating next-generation automotive designs with the TDA5 Virtualizer\u2122 Development Kit\t\t\t<\/a>\n\t\t<\/h3>\n\t\t\t\t<div class=\"elementor-post__excerpt\">\n\t\t\t<p>This blog post was originally published at\u00a0Texas Instruments\u2019 website. It is reprinted here with the permission of Texas Instruments. Introduction Continuous innovation in high-performance, power-efficient systems-on-a-chip (SoCs) is enabling safer, smarter and more autonomous driving experiences in even more vehicles. 
As another big step forward, Texas Instruments and Synopsys developed a\u00a0Virtualizer Development Kit\u2122 (VDK)\u00a0for the<\/p>\n\t\t<\/div>\n\t\t\n\t\t<a class=\"elementor-post__read-more\" href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/accelerating-next-generation-automotive-designs-with-the-tda5-virtualizer-development-kit\/\" aria-label=\"Read more about Accelerating next-generation automotive designs with the TDA5 Virtualizer\u2122 Development Kit\" tabindex=\"-1\" >\n\t\t\tRead More +\t\t<\/a>\n\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-post__meta-data\">\n\t\t\t\t\t<span class=\"elementor-post-date\">\n\t\t\tFebruary 10, 2026\t\t<\/span>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/article>\n\t\t\t\t<article class=\"elementor-post elementor-grid-item post-56608 post type-post status-publish format-standard has-post-thumbnail hentry category-automotive category-blog category-nvidia category-robotics category-software category-tools\" role=\"listitem\">\n\t\t\t<div class=\"elementor-post__card\">\n\t\t\t\t<a class=\"elementor-post__thumbnail__link\" href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/into-the-omniverse-openusd-and-nvidia-halos-accelerate-safety-for-robotaxis-physical-ai-systems\/\" tabindex=\"-1\" ><div class=\"elementor-post__thumbnail\"><img loading=\"lazy\" decoding=\"async\" width=\"300\" height=\"159\" src=\"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/01\/dec-ito-open-usd-halos-robotaxi-physical-ai-300x159.png\" class=\"attachment-medium size-medium wp-image-56609\" alt=\"\" srcset=\"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/01\/dec-ito-open-usd-halos-robotaxi-physical-ai-300x159.png 300w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/01\/dec-ito-open-usd-halos-robotaxi-physical-ai-1024x544.png 1024w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/01\/dec-ito-open-usd-halos-robotaxi-physical-ai-768x408.png 768w, 
https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2026\/01\/dec-ito-open-usd-halos-robotaxi-physical-ai.png 1280w\" sizes=\"(max-width: 300px) 100vw, 300px\" \/><\/div><\/a>\n\t\t\t\t<div class=\"elementor-post__badge\">Automotive<\/div>\n\t\t\t\t<div class=\"elementor-post__avatar\">\n\t\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-post__text\">\n\t\t\t\t<h3 class=\"elementor-post__title\">\n\t\t\t<a href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/into-the-omniverse-openusd-and-nvidia-halos-accelerate-safety-for-robotaxis-physical-ai-systems\/\" >\n\t\t\t\tInto the Omniverse: OpenUSD and NVIDIA Halos Accelerate Safety for Robotaxis, Physical AI Systems\t\t\t<\/a>\n\t\t<\/h3>\n\t\t\t\t<div class=\"elementor-post__excerpt\">\n\t\t\t<p>This blog post was originally published at NVIDIA\u2019s website. It is reprinted here with the permission of NVIDIA. NVIDIA Editor\u2019s note: This post is part of Into the Omniverse, a series focused on how developers, 3D practitioners and enterprises can transform their workflows using the latest advancements in OpenUSD and NVIDIA Omniverse. 
New NVIDIA safety<\/p>\n\t\t<\/div>\n\t\t\n\t\t<a class=\"elementor-post__read-more\" href=\"https:\/\/www.edge-ai-vision.com\/2026\/02\/into-the-omniverse-openusd-and-nvidia-halos-accelerate-safety-for-robotaxis-physical-ai-systems\/\" aria-label=\"Read more about Into the Omniverse: OpenUSD and NVIDIA Halos Accelerate Safety for Robotaxis, Physical AI Systems\" tabindex=\"-1\" >\n\t\t\tRead More +\t\t<\/a>\n\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-post__meta-data\">\n\t\t\t\t\t<span class=\"elementor-post-date\">\n\t\t\tFebruary 9, 2026\t\t<\/span>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/article>\n\t\t\t\t<\/div>\n\t\t\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t<div class=\"elementor-column elementor-col-33 elementor-top-column elementor-element elementor-element-2d700858\" data-id=\"2d700858\" data-element_type=\"column\">\n\t\t\t<div class=\"elementor-widget-wrap elementor-element-populated\">\n\t\t\t\t\t\t<div class=\"elementor-element elementor-element-16f14348 elementor-widget elementor-widget-heading\" data-id=\"16f14348\" data-element_type=\"widget\" data-widget_type=\"heading.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t<h2 class=\"elementor-heading-title elementor-size-default\">Functions<\/h2>\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-5c9a991b joinNowBtn elementor-widget elementor-widget-button\" data-id=\"5c9a991b\" data-element_type=\"widget\" data-widget_type=\"button.default\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t\t\t<div class=\"elementor-button-wrapper\">\n\t\t\t\t\t<a class=\"elementor-button elementor-button-link elementor-size-xl\" href=\"\/resources\/functions\">\n\t\t\t\t\t\t<span class=\"elementor-button-content-wrapper\">\n\t\t\t\t\t\t\t\t\t<span class=\"elementor-button-text\">VIEW 
ALL<\/span>\n\t\t\t\t\t<\/span>\n\t\t\t\t\t<\/a>\n\t\t\t\t<\/div>\n\t\t\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-element elementor-element-7e3c609c elementor-grid-1 elementor-grid-tablet-2 elementor-grid-mobile-1 elementor-posts--thumbnail-top elementor-posts--show-avatar elementor-card-shadow-yes elementor-posts__hover-gradient elementor-widget elementor-widget-posts\" data-id=\"7e3c609c\" data-element_type=\"widget\" data-settings=\"{&quot;cards_columns&quot;:&quot;1&quot;,&quot;cards_columns_tablet&quot;:&quot;2&quot;,&quot;cards_columns_mobile&quot;:&quot;1&quot;,&quot;cards_row_gap&quot;:{&quot;unit&quot;:&quot;px&quot;,&quot;size&quot;:35,&quot;sizes&quot;:[]},&quot;cards_row_gap_tablet&quot;:{&quot;unit&quot;:&quot;px&quot;,&quot;size&quot;:&quot;&quot;,&quot;sizes&quot;:[]},&quot;cards_row_gap_mobile&quot;:{&quot;unit&quot;:&quot;px&quot;,&quot;size&quot;:&quot;&quot;,&quot;sizes&quot;:[]}}\" data-widget_type=\"posts.cards\">\n\t\t\t\t<div class=\"elementor-widget-container\">\n\t\t\t\t\t\t\t<div class=\"elementor-posts-container elementor-posts elementor-posts--skin-cards elementor-grid\" role=\"list\">\n\t\t\t\t<article class=\"elementor-post elementor-grid-item post-56246 post type-post status-publish format-standard has-post-thumbnail hentry category-algorithms-and-models category-blog category-entertainment category-industrial-vision-computer-vision category-information-access-and-analytics category-object-identification\" role=\"listitem\">\n\t\t\t<div class=\"elementor-post__card\">\n\t\t\t\t<a class=\"elementor-post__thumbnail__link\" href=\"https:\/\/www.edge-ai-vision.com\/2025\/12\/ai-on-3-ways-to-bring-agentic-ai-to-computer-vision-applications\/\" tabindex=\"-1\" ><div class=\"elementor-post__thumbnail\"><img loading=\"lazy\" decoding=\"async\" width=\"300\" height=\"169\" src=\"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2025\/12\/image2-300x169.png\" class=\"attachment-medium size-medium wp-image-56247\" 
alt=\"\" srcset=\"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2025\/12\/image2-300x169.png 300w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2025\/12\/image2-1024x577.png 1024w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2025\/12\/image2-768x433.png 768w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2025\/12\/image2-1536x865.png 1536w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2025\/12\/image2.png 1999w\" sizes=\"(max-width: 300px) 100vw, 300px\" \/><\/div><\/a>\n\t\t\t\t<div class=\"elementor-post__badge\">Algorithms &amp; Models<\/div>\n\t\t\t\t<div class=\"elementor-post__avatar\">\n\t\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-post__text\">\n\t\t\t\t<h3 class=\"elementor-post__title\">\n\t\t\t<a href=\"https:\/\/www.edge-ai-vision.com\/2025\/12\/ai-on-3-ways-to-bring-agentic-ai-to-computer-vision-applications\/\" >\n\t\t\t\tAI On: 3 Ways to Bring Agentic AI to Computer Vision Applications\t\t\t<\/a>\n\t\t<\/h3>\n\t\t\t\t<div class=\"elementor-post__excerpt\">\n\t\t\t<p>This blog post was originally published at\u00a0NVIDIA\u2019s website. It is reprinted here with the permission of NVIDIA. Learn how to integrate vision language models into video analytics applications, from AI-powered search to fully automated video analysis. 
Today\u2019s\u00a0computer vision\u00a0systems excel at identifying what happens in physical spaces and processes, but lack the abilities to explain the<\/p>\n\t\t<\/div>\n\t\t\n\t\t<a class=\"elementor-post__read-more\" href=\"https:\/\/www.edge-ai-vision.com\/2025\/12\/ai-on-3-ways-to-bring-agentic-ai-to-computer-vision-applications\/\" aria-label=\"Read more about AI On: 3 Ways to Bring Agentic AI to Computer Vision Applications\" tabindex=\"-1\" >\n\t\t\tRead More +\t\t<\/a>\n\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-post__meta-data\">\n\t\t\t\t\t<span class=\"elementor-post-date\">\n\t\t\tDecember 16, 2025\t\t<\/span>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/article>\n\t\t\t\t<article class=\"elementor-post elementor-grid-item post-56058 post type-post status-publish format-standard has-post-thumbnail hentry category-algorithms-and-models category-articles category-multimodal category-object-identification category-object-tracking category-tools\" role=\"listitem\">\n\t\t\t<div class=\"elementor-post__card\">\n\t\t\t\t<a class=\"elementor-post__thumbnail__link\" href=\"https:\/\/www.edge-ai-vision.com\/2025\/11\/sam3-a-new-era-for-open%e2%80%91vocabulary-segmentation-and-edge-ai\/\" tabindex=\"-1\" ><div class=\"elementor-post__thumbnail\"><img loading=\"lazy\" decoding=\"async\" width=\"300\" height=\"171\" src=\"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2025\/11\/image-300x171.png\" class=\"attachment-medium size-medium wp-image-56059\" alt=\"\" srcset=\"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2025\/11\/image-300x171.png 300w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2025\/11\/image-1024x585.png 1024w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2025\/11\/image-768x439.png 768w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2025\/11\/image.png 1344w\" sizes=\"(max-width: 300px) 100vw, 300px\" \/><\/div><\/a>\n\t\t\t\t<div class=\"elementor-post__badge\">Algorithms &amp; 
Models<\/div>\n\t\t\t\t<div class=\"elementor-post__avatar\">\n\t\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-post__text\">\n\t\t\t\t<h3 class=\"elementor-post__title\">\n\t\t\t<a href=\"https:\/\/www.edge-ai-vision.com\/2025\/11\/sam3-a-new-era-for-open%e2%80%91vocabulary-segmentation-and-edge-ai\/\" >\n\t\t\t\tSAM3: A New Era for Open\u2011Vocabulary Segmentation and Edge AI\t\t\t<\/a>\n\t\t<\/h3>\n\t\t\t\t<div class=\"elementor-post__excerpt\">\n\t\t\t<p>Quality training data \u2013 especially segmented visual data \u2013 is a cornerstone of building robust vision models. Meta\u2019s recently announced Segment Anything Model 3 (SAM3) arrives as a potential game-changer in this domain. SAM3 is a unified model that can detect, segment, and even track objects in images and videos using both text and visual<\/p>\n\t\t<\/div>\n\t\t\n\t\t<a class=\"elementor-post__read-more\" href=\"https:\/\/www.edge-ai-vision.com\/2025\/11\/sam3-a-new-era-for-open%e2%80%91vocabulary-segmentation-and-edge-ai\/\" aria-label=\"Read more about SAM3: A New Era for Open\u2011Vocabulary Segmentation and Edge AI\" tabindex=\"-1\" >\n\t\t\tRead More +\t\t<\/a>\n\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-post__meta-data\">\n\t\t\t\t\t<span class=\"elementor-post-date\">\n\t\t\tNovember 24, 2025\t\t<\/span>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/article>\n\t\t\t\t<article class=\"elementor-post elementor-grid-item post-55790 post type-post status-publish format-standard has-post-thumbnail hentry category-biometrics category-blog category-e-con-systems category-industrial-vision-computer-vision category-medical category-sensors category-uncategorized\" role=\"listitem\">\n\t\t\t<div class=\"elementor-post__card\">\n\t\t\t\t<a class=\"elementor-post__thumbnail__link\" href=\"https:\/\/www.edge-ai-vision.com\/2025\/11\/tlens-vs-vcm-autofocus-technology\/\" tabindex=\"-1\" ><div class=\"elementor-post__thumbnail\"><img loading=\"lazy\" decoding=\"async\" width=\"300\" height=\"200\" 
src=\"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2025\/11\/TLens-vs-VCM-Autofocus-Technology-300x200.jpg\" class=\"attachment-medium size-medium wp-image-55792\" alt=\"\" srcset=\"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2025\/11\/TLens-vs-VCM-Autofocus-Technology-300x200.jpg 300w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2025\/11\/TLens-vs-VCM-Autofocus-Technology-1024x683.jpg 1024w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2025\/11\/TLens-vs-VCM-Autofocus-Technology-768x512.jpg 768w, https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2025\/11\/TLens-vs-VCM-Autofocus-Technology.jpg 1050w\" sizes=\"(max-width: 300px) 100vw, 300px\" \/><\/div><\/a>\n\t\t\t\t<div class=\"elementor-post__badge\">Biometrics<\/div>\n\t\t\t\t<div class=\"elementor-post__avatar\">\n\t\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-post__text\">\n\t\t\t\t<h3 class=\"elementor-post__title\">\n\t\t\t<a href=\"https:\/\/www.edge-ai-vision.com\/2025\/11\/tlens-vs-vcm-autofocus-technology\/\" >\n\t\t\t\tTLens vs VCM Autofocus Technology\t\t\t<\/a>\n\t\t<\/h3>\n\t\t\t\t<div class=\"elementor-post__excerpt\">\n\t\t\t<p>This blog post was originally published at\u00a0e-con Systems\u2019 website. It is reprinted here with the permission of e-con Systems. 
In this blog, we\u2019ll walk you through how TLens technology differs from traditional VCM autofocus, how TLens combined with e-con Systems\u2019 Tinte ISP enhances camera performance, key advantages of TLens over mechanical autofocus systems, and applications<\/p>\n\t\t<\/div>\n\t\t\n\t\t<a class=\"elementor-post__read-more\" href=\"https:\/\/www.edge-ai-vision.com\/2025\/11\/tlens-vs-vcm-autofocus-technology\/\" aria-label=\"Read more about TLens vs VCM Autofocus Technology\" tabindex=\"-1\" >\n\t\t\tRead More +\t\t<\/a>\n\n\t\t\t\t<\/div>\n\t\t\t\t<div class=\"elementor-post__meta-data\">\n\t\t\t\t\t<span class=\"elementor-post-date\">\n\t\t\tNovember 5, 2025\t\t<\/span>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/article>\n\t\t\t\t<\/div>\n\t\t\n\t\t\t\t\t\t<\/div>\n\t\t\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/div>\n\t\t\t\t\t<\/div>\n\t\t<\/section>\n\t\t\t\t<\/div>\n\t\t","protected":false},"excerpt":{"rendered":"<p>Resources In-depth information about the edge AI and vision applications, technologies, products, markets and trends. The content in this section of the website comes from Edge AI and Vision Alliance members and other industry luminaries. 
TECHNOLOGIES APPLICATIONS FUNCTIONS All Resources Technologies VIEW ALL Applications VIEW ALL Functions VIEW ALL<\/p>\n","protected":false},"author":1,"featured_media":0,"parent":0,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"","meta":{"content-type":"","_uag_custom_page_level_css":"","site-sidebar-layout":"no-sidebar","site-content-layout":"page-builder","ast-site-content-layout":"default","site-content-style":"default","site-sidebar-style":"default","ast-global-header-display":"","ast-banner-title-visibility":"","ast-main-header-display":"","ast-hfb-above-header-display":"","ast-hfb-below-header-display":"","ast-hfb-mobile-header-display":"","site-post-title":"disabled","ast-breadcrumbs-content":"","ast-featured-img":"disabled","footer-sml-layout":"","ast-disable-related-posts":"","theme-transparent-header-meta":"","adv-header-id-meta":"","stick-header-meta":"","header-above-stick-meta":null,"header-main-stick-meta":"","header-below-stick-meta":"","astra-migrate-meta-layouts":"default","ast-page-background-enabled":"default","ast-page-background-meta":{"desktop":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"","background-image":"","background-repeat":"repeat","background-position":"center 
center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"ast-content-background-meta":{"desktop":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"tablet":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""},"mobile":{"background-color":"var(--ast-global-color-5)","background-image":"","background-repeat":"repeat","background-position":"center center","background-size":"auto","background-attachment":"scroll","background-type":"","background-media":"","overlay-type":"","overlay-color":"","overlay-opacity":"","overlay-gradient":""}},"footnotes":""},"summityear":[],"class_list":["post-23948","page","type-page","status-publish","hentry"],"yoast_head":"<!-- This site is optimized with the Yoast SEO plugin v26.8 - https:\/\/yoast.com\/product\/yoast-seo-wordpress\/ -->\n<title>Resources - Edge AI and Vision Alliance<\/title>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.edge-ai-vision.com\/resources\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Resources - Edge AI and Vision Alliance\" \/>\n<meta property=\"og:description\" content=\"Resources In-depth information about the edge AI and vision 
applications, technologies, products, markets and trends. The content in this section of the website comes from Edge AI and Vision Alliance members and other industry luminaries. TECHNOLOGIES APPLICATIONS FUNCTIONS All Resources Technologies VIEW ALL Applications VIEW ALL Functions VIEW ALL\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.edge-ai-vision.com\/resources\/\" \/>\n<meta property=\"og:site_name\" content=\"Edge AI and Vision Alliance\" \/>\n<meta property=\"article:publisher\" content=\"https:\/\/www.facebook.com\/EdgeAIVision\/\" \/>\n<meta property=\"article:modified_time\" content=\"2023-08-24T20:18:16+00:00\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:site\" content=\"@edgeaivision\" \/>\n<meta name=\"twitter:label1\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data1\" content=\"9 minutes\" \/>\n<script type=\"application\/ld+json\" class=\"yoast-schema-graph\">{\"@context\":\"https:\/\/schema.org\",\"@graph\":[{\"@type\":\"WebPage\",\"@id\":\"https:\/\/www.edge-ai-vision.com\/resources\/\",\"url\":\"https:\/\/www.edge-ai-vision.com\/resources\/\",\"name\":\"Resources - Edge AI and Vision 
Alliance\",\"isPartOf\":{\"@id\":\"https:\/\/www.edge-ai-vision.com\/#website\"},\"datePublished\":\"2020-02-10T23:33:42+00:00\",\"dateModified\":\"2023-08-24T20:18:16+00:00\",\"breadcrumb\":{\"@id\":\"https:\/\/www.edge-ai-vision.com\/resources\/#breadcrumb\"},\"inLanguage\":\"en-US\",\"potentialAction\":[{\"@type\":\"ReadAction\",\"target\":[\"https:\/\/www.edge-ai-vision.com\/resources\/\"]}]},{\"@type\":\"BreadcrumbList\",\"@id\":\"https:\/\/www.edge-ai-vision.com\/resources\/#breadcrumb\",\"itemListElement\":[{\"@type\":\"ListItem\",\"position\":1,\"name\":\"Home\",\"item\":\"https:\/\/www.edge-ai-vision.com\/\"},{\"@type\":\"ListItem\",\"position\":2,\"name\":\"Resources\"}]},{\"@type\":\"WebSite\",\"@id\":\"https:\/\/www.edge-ai-vision.com\/#website\",\"url\":\"https:\/\/www.edge-ai-vision.com\/\",\"name\":\"Edge AI and Vision Alliance\",\"description\":\"Designing machines that perceive and understand.\",\"publisher\":{\"@id\":\"https:\/\/www.edge-ai-vision.com\/#organization\"},\"potentialAction\":[{\"@type\":\"SearchAction\",\"target\":{\"@type\":\"EntryPoint\",\"urlTemplate\":\"https:\/\/www.edge-ai-vision.com\/?s={search_term_string}\"},\"query-input\":{\"@type\":\"PropertyValueSpecification\",\"valueRequired\":true,\"valueName\":\"search_term_string\"}}],\"inLanguage\":\"en-US\"},{\"@type\":\"Organization\",\"@id\":\"https:\/\/www.edge-ai-vision.com\/#organization\",\"name\":\"Edge AI and Vision Alliance\",\"url\":\"https:\/\/www.edge-ai-vision.com\/\",\"logo\":{\"@type\":\"ImageObject\",\"inLanguage\":\"en-US\",\"@id\":\"https:\/\/www.edge-ai-vision.com\/#\/schema\/logo\/image\/\",\"url\":\"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2020\/01\/1200x675header_edgeai_vision.jpg\",\"contentUrl\":\"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2020\/01\/1200x675header_edgeai_vision.jpg\",\"width\":1200,\"height\":675,\"caption\":\"Edge AI and Vision 
Alliance\"},\"image\":{\"@id\":\"https:\/\/www.edge-ai-vision.com\/#\/schema\/logo\/image\/\"},\"sameAs\":[\"https:\/\/www.facebook.com\/EdgeAIVision\/\",\"https:\/\/x.com\/edgeaivision\",\"https:\/\/www.linkedin.com\/company\/edgeaivision\/\",\"http:\/\/www.youtube.com\/embeddedvision\"]}]}<\/script>\n<!-- \/ Yoast SEO plugin. -->","yoast_head_json":{"title":"Resources - Edge AI and Vision Alliance","robots":{"index":"index","follow":"follow","max-snippet":"max-snippet:-1","max-image-preview":"max-image-preview:large","max-video-preview":"max-video-preview:-1"},"canonical":"https:\/\/www.edge-ai-vision.com\/resources\/","og_locale":"en_US","og_type":"article","og_title":"Resources - Edge AI and Vision Alliance","og_description":"Resources In-depth information about the edge AI and vision applications, technologies, products, markets and trends. The content in this section of the website comes from Edge AI and Vision Alliance members and other industry luminaries. TECHNOLOGIES APPLICATIONS FUNCTIONS All Resources Technologies VIEW ALL Applications VIEW ALL Functions VIEW ALL","og_url":"https:\/\/www.edge-ai-vision.com\/resources\/","og_site_name":"Edge AI and Vision Alliance","article_publisher":"https:\/\/www.facebook.com\/EdgeAIVision\/","article_modified_time":"2023-08-24T20:18:16+00:00","twitter_card":"summary_large_image","twitter_site":"@edgeaivision","twitter_misc":{"Est. 
reading time":"9 minutes"},"schema":{"@context":"https:\/\/schema.org","@graph":[{"@type":"WebPage","@id":"https:\/\/www.edge-ai-vision.com\/resources\/","url":"https:\/\/www.edge-ai-vision.com\/resources\/","name":"Resources - Edge AI and Vision Alliance","isPartOf":{"@id":"https:\/\/www.edge-ai-vision.com\/#website"},"datePublished":"2020-02-10T23:33:42+00:00","dateModified":"2023-08-24T20:18:16+00:00","breadcrumb":{"@id":"https:\/\/www.edge-ai-vision.com\/resources\/#breadcrumb"},"inLanguage":"en-US","potentialAction":[{"@type":"ReadAction","target":["https:\/\/www.edge-ai-vision.com\/resources\/"]}]},{"@type":"BreadcrumbList","@id":"https:\/\/www.edge-ai-vision.com\/resources\/#breadcrumb","itemListElement":[{"@type":"ListItem","position":1,"name":"Home","item":"https:\/\/www.edge-ai-vision.com\/"},{"@type":"ListItem","position":2,"name":"Resources"}]},{"@type":"WebSite","@id":"https:\/\/www.edge-ai-vision.com\/#website","url":"https:\/\/www.edge-ai-vision.com\/","name":"Edge AI and Vision Alliance","description":"Designing machines that perceive and understand.","publisher":{"@id":"https:\/\/www.edge-ai-vision.com\/#organization"},"potentialAction":[{"@type":"SearchAction","target":{"@type":"EntryPoint","urlTemplate":"https:\/\/www.edge-ai-vision.com\/?s={search_term_string}"},"query-input":{"@type":"PropertyValueSpecification","valueRequired":true,"valueName":"search_term_string"}}],"inLanguage":"en-US"},{"@type":"Organization","@id":"https:\/\/www.edge-ai-vision.com\/#organization","name":"Edge AI and Vision Alliance","url":"https:\/\/www.edge-ai-vision.com\/","logo":{"@type":"ImageObject","inLanguage":"en-US","@id":"https:\/\/www.edge-ai-vision.com\/#\/schema\/logo\/image\/","url":"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2020\/01\/1200x675header_edgeai_vision.jpg","contentUrl":"https:\/\/www.edge-ai-vision.com\/wp-content\/uploads\/2020\/01\/1200x675header_edgeai_vision.jpg","width":1200,"height":675,"caption":"Edge AI and Vision 
Alliance"},"image":{"@id":"https:\/\/www.edge-ai-vision.com\/#\/schema\/logo\/image\/"},"sameAs":["https:\/\/www.facebook.com\/EdgeAIVision\/","https:\/\/x.com\/edgeaivision","https:\/\/www.linkedin.com\/company\/edgeaivision\/","http:\/\/www.youtube.com\/embeddedvision"]}]}},"uagb_featured_image_src":{"full":false,"thumbnail":false,"medium":false,"medium_large":false,"large":false,"1536x1536":false,"2048x2048":false},"uagb_author_info":{"display_name":"wordpress","author_link":"https:\/\/www.edge-ai-vision.com\/author\/wordpress\/"},"uagb_comment_info":0,"uagb_excerpt":"Resources In-depth information about the edge AI and vision applications, technologies, products, markets and trends. The content in this section of the website comes from Edge AI and Vision Alliance members and other industry luminaries. TECHNOLOGIES APPLICATIONS FUNCTIONS All Resources Technologies VIEW ALL Applications VIEW ALL Functions VIEW ALL","_links":{"self":[{"href":"https:\/\/www.edge-ai-vision.com\/wp-json\/wp\/v2\/pages\/23948","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/www.edge-ai-vision.com\/wp-json\/wp\/v2\/pages"}],"about":[{"href":"https:\/\/www.edge-ai-vision.com\/wp-json\/wp\/v2\/types\/page"}],"author":[{"embeddable":true,"href":"https:\/\/www.edge-ai-vision.com\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.edge-ai-vision.com\/wp-json\/wp\/v2\/comments?post=23948"}],"version-history":[{"count":7,"href":"https:\/\/www.edge-ai-vision.com\/wp-json\/wp\/v2\/pages\/23948\/revisions"}],"predecessor-version":[{"id":43287,"href":"https:\/\/www.edge-ai-vision.com\/wp-json\/wp\/v2\/pages\/23948\/revisions\/43287"}],"wp:attachment":[{"href":"https:\/\/www.edge-ai-vision.com\/wp-json\/wp\/v2\/media?parent=23948"}],"wp:term":[{"taxonomy":"summityear","embeddable":true,"href":"https:\/\/www.edge-ai-vision.com\/wp-json\/wp\/v2\/summityear?post=23948"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}