Market Research Report

Emerging Image Sensor Technologies 2021-2031: Applications and Markets

Publisher: IDTechEx Ltd.
Product code: 1015123
Published: June 23, 2021
Content: English, 307 Slides
Delivery: within 1-2 business days
Product Code: ISBN 9781913899530

Title:
Emerging Image Sensor Technologies 2021-2031: Applications and Markets
Innovative image sensors for autonomous vehicles, UAVs, precision agriculture and industrial automation. Includes organic photodetectors, SWIR image sensors, event-based vision, hyperspectral imaging, flexible x-ray detectors, and wavefront imaging.

"Autonomous technologies will lead the market for emerging image sensors to $360 million by 2031."

Image sensing is a highly important capability, used in applications ranging from webcams and smartphone cameras to autonomous vehicles and industrial inspection. This report from IDTechEx comprehensively explores the market for emerging image sensors, covering a diverse range of technologies that span from thin-film flexible photodetectors to event-based vision.

While conventional CMOS detectors for visible light are well established and somewhat commoditized, at least for low value applications, there is an extensive opportunity for more complex image sensors that offer capabilities beyond that of simply acquiring red, green and blue (RGB) intensity values. As such, extensive effort is currently being devoted to developing emerging image sensor technologies that can detect aspects of light beyond human vision. This includes imaging over a broader spectral range, over a larger area, acquiring spectral data at each pixel, and simultaneously increasing temporal resolution and dynamic range.

Much of this opportunity stems from the ever-increasing adoption of machine vision, in which image analysis is performed by computational algorithms. Machine learning requires as much input data as possible to establish correlations that can facilitate object identification and classification, so acquiring optical information over a different wavelength range, or with spectral resolution for example, is highly advantageous.

Of course, emerging image sensor technologies offer many other benefits. Depending on the technology, these can include similar capabilities at a lower cost, increased dynamic range, improved temporal resolution, spatially variable sensitivity, global shutters at high resolution, reduced unwanted influence of scattering, flexibility/conformality and more. A particularly important trend is the development of much cheaper alternatives to very expensive InGaAs sensors for imaging in the short-wave infra-red (SWIR, 1000 - 2000 nm) spectral region, which will open up this capability to a much wider range of applications. This includes autonomous vehicles, in which SWIR imaging assists with distinguishing objects/materials that appear similar in the visible spectrum, while also reducing scattering from dust and fog.

The report covers the following technologies:

  • Quantum dots on silicon hybrid image sensors
  • Organic photodetectors on silicon hybrid image sensors
  • Emerging SWIR image sensor technologies
  • Organic and perovskite photodiodes (OPDs and PPDs)
  • Event-based vision
  • Hyperspectral imaging
  • Flexible x-ray image sensors
  • Wavefront imaging
  • Hybrid image sensors. Adding an additional light absorbing layer on top of a CMOS read-out circuit is a hybrid approach that utilizes either organic semiconductors or quantum dots to increase the spectral sensitivity into the SWIR region. Currently dominated by expensive InGaAs sensors, this new technology promises a substantial price reduction and hence adoption of SWIR imaging for new applications such as autonomous vehicles.
  • Extended-range silicon. Given the very high price of InGaAs sensors, there is considerable motivation to develop much lower cost alternatives that can detect light towards the lower end of the SWIR spectral region. Such SWIR sensors could then be employed in vehicles to provide better vision through fog and dust due to reduced scattering.
  • Thin film photodetectors. Detection of light over a large area, rather than at a single small detector, is highly desirable for acquiring biometric data and, if flexible, for imaging through the skin. At present, the high cost of silicon means that large-area image sensors can be prohibitively expensive. However, emerging approaches that utilize solution processable semiconductors offer a compelling way to produce large-area conformal photodetectors. Printed organic photodetectors (OPDs) are the most developed approach, with under-display fingerprint detection being actively explored.
  • Event-based vision: Autonomous vehicles, drones and high-speed industrial applications require image sensing with a high temporal resolution. However, with conventional frame-based imaging a high temporal resolution produces vast amounts of data that requires computationally intensive processing. Event-based vision, also known as dynamic vision sensing (DVS), is an emerging technology that resolves this challenge. It is a completely new way of thinking about obtaining optical information, in which each sensor pixel reports timestamps that correspond to intensity changes. As such, event-based vision can combine greater temporal resolution of rapidly changing image regions with much reduced data transfer and subsequent processing requirements (a simple illustration of this data format is sketched after this list).
  • Hyperspectral imaging: Obtaining as much information as possible from incident light is highly advantageous for applications that require object identification, since classification algorithms have more data to work with. Hyperspectral imaging, in which a complete spectrum is acquired at each pixel to produce an (x, y, λ) data cube using a dispersive optical element and an image sensor, is a relatively established technology that has gained traction for precision agriculture and industrial process inspection (a short sketch of such a data cube follows this list). However, at present most hyperspectral cameras work on a line-scan principle, while SWIR hyperspectral imaging is restricted to relatively niche applications due to the high cost of InGaAs sensors. Emerging technologies look set to disrupt both these aspects, with snapshot imaging offering an alternative to line-scan cameras and with the new SWIR sensing technologies outlined above facilitating cost reduction and adoption for a wider range of applications.
  • Flexible x-ray sensors: X-ray sensors are well-established and highly important for medical and security applications. However, the difficulty in focusing x-rays means that sensors need to cover a large area. Furthermore, since silicon cannot effectively absorb x-rays a scintillator layer is commonly used. However, both these aspects increase sensor size and weight, making x-ray detectors bulky and unwieldy. Flexible x-ray sensors based on an amorphous silicon backplane offer a compelling alternative, since they would be lighter and conformal (especially useful for imaging curved body parts). Looking further ahead, direct x-ray sensors based on solution processable semiconductors offer reduced weight and complexity along with the potential for higher spatial resolution.
  • Wavefront imaging: Wavefront (or phase) imaging enables the extraction of phase information from incident light that is lost by a conventional sensor. This technique is currently used for niche applications such as optical component design/inspection and ophthalmology. However, recent advances have led to significant resolution improvements which will allow this technology to be applied somewhat more widely. Biological imaging is one of the more promising emerging applications, in which collecting phase along with intensity reduces the influence of scattering and thus enables better defined images.
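As a rough illustration of the event-based data format described in the list above, the short Python sketch below (using invented numbers, not code from any sensor vendor's SDK) contrasts one second of conventional frame read-out with a sparse stream of timestamped per-pixel events. It is only intended to show why reporting changes, rather than full frames, can drastically cut the data volume.

```python
import numpy as np

# Conventional frame-based capture: every pixel is read out for every frame,
# whether or not anything in the scene changed. One second at 100 fps, 480x640:
frames = np.zeros((100, 480, 640), dtype=np.uint8)
frame_values = frames.size              # 30,720,000 pixel values

# Event-based capture: each pixel independently reports an event
# (timestamp, x, y, polarity) only when its intensity changes by more than a
# threshold. The 100,000 events below are fabricated purely for illustration.
rng = np.random.default_rng(0)
n_events = 100_000
events = np.stack([
    rng.uniform(0.0, 1.0, n_events),    # timestamp in seconds
    rng.integers(0, 640, n_events),     # x coordinate
    rng.integers(0, 480, n_events),     # y coordinate
    rng.choice([-1, 1], n_events),      # polarity: intensity up or down
], axis=1)
event_values = events.size              # 400,000 values

print(f"values read out, frame-based: {frame_values:,}")
print(f"values read out, event-based: {event_values:,}")
```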
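Likewise, the (x, y, λ) hyperspectral data cube mentioned above can be pictured as a three-dimensional array in which every spatial pixel carries a full spectrum rather than three RGB values. The sketch below uses arbitrary dimensions purely for illustration and shows how such a cube might be indexed to pull out one pixel's spectrum or a single-wavelength image.

```python
import numpy as np

# Illustrative hyperspectral data cube: 128 x 128 spatial pixels, 224 spectral
# bands (both values chosen arbitrarily for this example).
cube = np.random.rand(128, 128, 224)            # axes: (x, y, wavelength)
wavelengths_nm = np.linspace(400, 2500, 224)    # e.g. visible through SWIR

# Full spectrum recorded at one spatial pixel.
spectrum = cube[50, 80, :]

# Single-wavelength image, here near the ~1450 nm water absorption band of the
# kind used to assess moisture content.
band = int(np.argmin(np.abs(wavelengths_nm - 1450)))
water_band_image = cube[:, :, band]

# An RGB image of the same scene would store 3 values per pixel; this cube
# stores 224, i.e. roughly 75x more data to feed classification algorithms.
print(spectrum.shape, water_band_image.shape)
```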

In summary, increasing adoption of computational image analysis provides a great opportunity for image sensing technologies that offer capabilities beyond conventional CMOS sensors. This report offers a comprehensive overview of the market for emerging image sensor technologies and associated technical developments, covering a multitude of applications that range from autonomous vehicles to industrial quality control. Expect to see many of these exciting and innovative imaging technologies being rapidly adopted over the next decade.

The following information is included within the report:

  • Executive summary & conclusions.
  • Detailed technical analysis of the emerging image sensor technologies outlined above.
  • Highly granular 10-year market forecasts, split by technology and subsequently by application. This includes over 40 individual forecast categories. Forecasts are expressed by both volume and revenue.
  • Technological/commercial readiness assessments, split by technology and application.
  • Commercial motivation for developing and adopting each of the emerging image sensing technologies.
  • Multiple application case studies for each image sensing technology.
  • SWOT analysis of each image sensing technology.
  • Overview of the key players within each technology category.
  • Over 25 company profiles, the majority based on recent primary interviews. These include a discussion of current status, technology, potential markets and business model, along with company financial information (where disclosed) and our SWOT analysis.
  • Selected highlights from academic research relevant to emerging image sensor technologies.

Analyst access from IDTechEx

All report purchases include up to 30 minutes telephone time with an expert analyst who will help you link key findings in the report to the business issues you're addressing. This needs to be used within three months of purchasing the report.

TABLE OF CONTENTS

1. EXECUTIVE SUMMARY

  • 1.1. Key takeaways
  • 1.2. Conventional image sensors: Market overview
  • 1.3. Motivation for short-wave infra-red (SWIR) imaging
  • 1.4. SWIR imaging: Incumbent and emerging technology options
  • 1.5. Opportunities for SWIR image sensors
  • 1.6. SWIR sensors: Application overview
  • 1.7. OPD-on-CMOS hybrid image sensors
  • 1.8. Quantum dots as optical sensor materials
  • 1.9. Prospects for QD/OPD-on-CMOS detectors
  • 1.10. Challenges for QD-Si technology for SWIR imaging
  • 1.11. Overview of thin film organic and perovskite photodetectors
  • 1.12. Applications of organic photodetectors.
  • 1.13. Introduction to hyperspectral imaging
  • 1.14. Overview of hyperspectral imaging
  • 1.15. What is event-based vision?
  • 1.16. Promising applications for event-based vision
  • 1.17. Overview of event-based vision
  • 1.18. Overview of wavefront imaging
  • 1.19. Overview of flexible and direct x-ray image sensors
  • 1.20. 10-year market forecast for emerging image sensor technologies
  • 1.21. 10-year market forecast for emerging image sensor technologies (by volume)
  • 1.22. 10-year market forecast for emerging image sensor technologies (by volume, data table)
  • 1.23. 10-year market forecast for emerging image sensor technologies (by revenue)
  • 1.24. 10-year market forecast for emerging image sensor technologies (by revenue, data table)

2. INTRODUCTION

  • 2.1. What is a sensor?
  • 2.2. Sensor value chain example: Digital camera
  • 2.3. Photodetector working principles
  • 2.4. Quantifying photodetector and image sensor performance
  • 2.5. Extracting as much information as possible from light
  • 2.6. Autonomous vehicles will need machine vision
  • 2.7. Trends in autonomous vehicle adoption
  • 2.8. What are the levels of automation in cars?
  • 2.9. Global autonomous car market
  • 2.10. How many cameras are needed at different automotive autonomy levels
  • 2.11. Growing drone uses provides extensive market for emerging image sensors
  • 2.12. Emerging image sensors required for drones

3. MARKET FORECASTS

  • 3.1. Market forecast methodology
  • 3.2. Parametrizing forecast curves
  • 3.3. Determining total addressable markets (TAMs)
  • 3.4. Determining revenues
  • 3.5. 10-year short-wave infra-red (SWIR) image sensors market forecast: by volume
  • 3.6. 10-year hybrid OPD-on-CMOS image sensors market forecast: by volume
  • 3.7. 10-year hybrid OPD-on-CMOS image sensors market forecast: by revenue
  • 3.8. 10-year hybrid QD-on-CMOS image sensors market forecast: by volume
  • 3.9. 10-year hybrid QD-on-CMOS image sensors market forecast: by revenue
  • 3.10. 10-year thin film organic and perovskite photodetectors (OPDs and PPDs) market forecast: by volume
  • 3.11. 10-year thin film organic and perovskite photodetectors (OPDs and PPDs) market forecast: by revenue
  • 3.12. 10-year hyperspectral imaging market forecast: by volume
  • 3.13. 10-year hyperspectral imaging market forecast: by revenue
  • 3.14. 10-year event-based vision market forecast: by volume
  • 3.15. 10-year event-based vision market forecast: by revenue
  • 3.16. 10-year wavefront imaging market forecast: by volume
  • 3.17. 10-year wavefront imaging market forecast: by revenue
  • 3.18. 10-year flexible x-ray image sensors market forecast: by volume

4. BRIEF OVERVIEW OF ESTABLISHED VISIBLE IMAGE SENSORS (CCD AND CMOS)

  • 4.1. Conventional image sensors: Market overview
  • 4.2. Key components in a CMOS image sensor (CIS)
  • 4.3. Sensor architectures: Front and backside illumination
  • 4.4. Process flow for back-side-illuminated CMOS image sensors
  • 4.5. Comparing CMOS and CCD image sensors
  • 4.6. Benefits of global rather than rolling shutters

5. SHORT-WAVE INFRA-RED (SWIR) IMAGE SENSORS

  • 5.1. Motivation for short-wave infra-red (SWIR) imaging
  • 5.2. SWIR imaging reduces light scattering
  • 5.3. SWIR: Incumbent and emerging technology options
  • 5.4. Applications for SWIR imaging
    • 5.4.1. Applications for SWIR imaging
    • 5.4.2. Identifying water content with SWIR imaging
    • 5.4.3. SWIR for autonomous mobility
    • 5.4.4. SWIR imaging enables better hazard detection
    • 5.4.5. SWIR enables imaging through silicon wafers
    • 5.4.6. Imaging temperature with SWIR light
    • 5.4.7. Visualization of foreign materials during industrial inspection
    • 5.4.8. Spectroscopic chemical sensors
    • 5.4.9. SWIR image sensing for industrial process optimization
    • 5.4.10. MULTIPLE (EU Project): Focus areas, targets and participants
    • 5.4.11. SWIR spectroscopy: Wearable applications
    • 5.4.12. SWIR spectroscopy: Determining water and body temperature via wearable technology
    • 5.4.13. SWIR spectroscopy: Alcohol detection
    • 5.4.14. SWIR image sensors for hyperspectral imaging
    • 5.4.15. SWIR sensors: Application overview
    • 5.4.16. SWIR application requirements
  • 5.5. InGaAs sensors - existing technology for SWIR imaging
    • 5.5.1. Existing long wavelength detection: InGaAs
    • 5.5.2. The challenge of high resolution, low cost IR sensors
    • 5.5.3. InGaAs sensor design: Solder bumps limit resolution
    • 5.5.4. Sony improve InGaAs sensor resolution and spectral range
  • 5.6. Emerging inorganic SWIR technologies and players
    • 5.6.1. Trieye: Innovative silicon based SWIR sensors
    • 5.6.2. OmniVision: Making silicon CMOS sensitive to NIR (II)
    • 5.6.3. SWOT analysis: SWIR image sensors (non-hybrid, non-InGaAs)
    • 5.6.4. Supplier overview: Emerging SWIR image sensors
    • 5.6.5. Company profiles: SWIR imaging (excluding hybrid approaches)

6. HYBRID OPD-ON-CMOS IMAGE SENSORS (INCLUDING FOR SWIR)

  • 6.1. OPD-on-CMOS hybrid image sensors
  • 6.2. Hybrid organic/CMOS sensor
  • 6.3. Hybrid organic/CMOS sensor for broadcast cameras
  • 6.4. Comparing hybrid organic/CMOS sensor with backside illumination CMOS sensor
  • 6.5. Progress in CMOS global shutter using silicon technology only
  • 6.6. Fraunhofer FEP: SWIR OPD-on-CMOS sensors (I)
  • 6.7. Fraunhofer FEP: SWIR OPD-on-CMOS sensors (II)
  • 6.8. Academic research: Twisted bilayer graphene sensitive to longer wavelength IR light
  • 6.9. Technology readiness level of OPD-on-CMOS detectors by application
  • 6.10. SWOT analysis of OPD-on-CMOS image sensors
  • 6.11. Supplier overview: OPD-on-CMOS hybrid image sensors
  • 6.12. Company profiles: OPD-on-CMOS

7. HYBRID QD-ON-CMOS IMAGE SENSORS

  • 7.1. Quantum dots as optical sensor materials
  • 7.2. Lead sulphide as quantum dots
  • 7.3. Quantum dots: Choice of the material system
  • 7.4. Applications and challenges for quantum dots in image sensors
  • 7.5. QD layer advantage in image sensor (I): Increasing sensor sensitivity and gain
  • 7.6. QD-Si hybrid image sensors (II): Reducing thickness
  • 7.7. Detectivity benchmarking (I)
  • 7.8. Detectivity benchmarking (II)
  • 7.9. Hybrid QD-on-CMOS with global shutter for SWIR imaging.
  • 7.10. QD-Si hybrid image sensors: Enabling high resolution global shutter
    • 7.11. QD-Si hybrid image sensors (IV): Low power and high sensitivity to structured light detection for machine vision?
  • 7.12. How is the QD layer applied?
  • 7.13. Advantage of solution processing: ease of integration with ROIC CMOS?
  • 7.14. QD optical layer: Approaches to increase conductivity of QD films
  • 7.15. Quantum dots: Covering the range from visible to near infrared
  • 7.16. SWIR sensitivity of PbS QDs, Si, polymers, InGaAs, HgCdTe, etc...
  • 7.17. Hybrid QD-on-CMOS image sensors: Processing
    • 7.17.1. Value chain and production steps for QD-on-CMOS
    • 7.17.2. Advantage of solution processing: Ease of integration with CMOS ROICs?
    • 7.17.3. Quantum dot films: Processing challenges
    • 7.17.4. QD-on-CMOS with graphene interlayer
    • 7.17.5. Challenges for QD-Si technology for SWIR imaging
    • 7.17.6. QD-on-CMOS sensors: Ongoing technical challenges
    • 7.17.7. Technology readiness level of QD-on-CMOS detectors by application
  • 7.18. Hybrid QD-on-CMOS image sensors: Key players
    • 7.18.1. SWIR Vision Systems: Hybrid quantum dots for SWIR imaging
    • 7.18.2. SWIR Vision Systems: First commercial QD-CMOS cameras
    • 7.18.3. IMEC: QD-on-CMOS integration examples (I)
    • 7.18.4. IMEC: QD-on-CMOS integration examples (II)
    • 7.18.5. RTI International: QD-on-CMOS integration examples
    • 7.18.6. QD-on-CMOS integration examples (ICFO continued)
    • 7.18.7. Emberion: QD-graphene SWIR sensor
    • 7.18.8. Emberion: QD-Graphene-Si broadrange SWIR sensor
    • 7.18.9. Emberion: VIS-SWIR camera with 400 to 2000 nm spectral range
    • 7.18.10. Qurv Technologies: Graphene/quantum dot image sensor company spun off from ICFO
    • 7.18.11. Academic research: QD-on-CMOS from Hanyang University (South Korea)
    • 7.18.12. Academic research: Colloidal quantum dots enable mid-IR sensing
    • 7.18.13. Academic research: Plasmonic nanocubes make a cheap SWIR camera
  • 7.19. Summary: QD-on-CMOS image sensors
    • 7.19.1. Summary: QD/OPD-on-CMOS detectors
    • 7.19.2. SWOT analysis of QD-on-CMOS image sensors
    • 7.19.3. Supplier overview: QD-on-CMOS hybrid image sensors
    • 7.19.4. Company profiles: Hybrid QD-on-CMOS image sensors

8. THIN FILM PHOTODETECTORS (ORGANIC AND PEROVSKITE)

  • 8.1. Introduction to thin film photodetectors (organic and perovskite)
  • 8.2. Organic photodetectors (OPDs)
  • 8.3. Thin film photodetectors: Advantages and disadvantages
  • 8.4. Reducing dark current to increase dynamic range
  • 8.5. Tailoring the detection wavelength to specific applications
  • 8.6. Extending OPDs to the NIR region: Use of cavities
  • 8.7. Technical challenges for manufacturing thin film photodetectors from solution
  • 8.8. Materials for thin film photodetectors
  • 8.9. Thin film organic and perovskite photodiodes (OPDs and PPDs): Applications and key players
    • 8.9.1. Applications of organic photodetectors
    • 8.9.2. OPDs for biometric security
    • 8.9.3. Spray-coated organic photodiodes for medical imaging
    • 8.9.4. ISORG: 'Fingerprint on display' with OPDs
    • 8.9.5. ISORG: Flexible OPD applications using TFT active matrix
    • 8.9.6. ISORG: First OPD production line
    • 8.9.7. Cambridge display technology: Pulse oximetry sensing with OPDs
    • 8.9.8. Holst Center: Perovskite based image sensors
    • 8.9.9. Commercial challenges for large-area OPD adoption
    • 8.9.10. Technical requirements for thin film photodetector applications
    • 8.9.11. Thin film OPD and PPD application requirements
    • 8.9.12. Application assessment for thin film OPDs and PPDs
    • 8.9.13. Technology readiness level of organic and perovskite photodetectors by applications
  • 8.10. Organic and perovskite thin film photodetectors (OPDs and PPDs): Summary
    • 8.10.1. Summary: Thin film organic and perovskite photodetectors
    • 8.10.2. SWOT analysis of large area OPD image sensors
    • 8.10.3. Supplier overview: Thin film photodetectors
    • 8.10.4. Company profiles: Organic photodiodes (OPDs)

9. HYPERSPECTRAL IMAGING

  • 9.1. Introduction to hyperspectral imaging
  • 9.2. Multiple methods to acquire a hyperspectral data-cube
  • 9.3. Contrasting device architectures for hyperspectral data acquisition (II)
  • 9.4. Line-scan (pushbroom) cameras ideal for conveyor belts and satellite images
  • 9.5. Comparison between 'push-broom' and older hyperspectral imaging methods
  • 9.6. Line-scan hyperspectral camera design
  • 9.7. Snapshot hyperspectral imaging
  • 9.8. Illumination for hyperspectral imaging
  • 9.9. Pansharpening for multi/hyper-spectral image enhancement
  • 9.10. Hyperspectral imaging as a development of multispectral imaging
  • 9.11. Trade-offs between hyperspectral and multispectral imaging
  • 9.12. Towards broadband hyperspectral imaging
  • 9.13. Hyperspectral imaging: Applications
    • 9.13.1. Hyperspectral imaging and precision agriculture
    • 9.13.2. Hyperspectral imaging from UAVs (drones)
    • 9.13.3. Agricultural drones ecosystem develops
    • 9.13.4. Satellite imaging with hyperspectral cameras
    • 9.13.5. Historic drone investment creates demand for hyperspectral imaging
    • 9.13.6. In-line inspection with hyperspectral imaging
    • 9.13.7. Object identification with in-line hyperspectral imaging
    • 9.13.8. Sorting objects for recycling with hyperspectral imaging
    • 9.13.9. Food inspection with hyperspectral imaging
    • 9.13.10. Hyperspectral imaging for skin diagnostics
    • 9.13.11. Hyperspectral imaging application requirements
  • 9.14. Hyperspectral imaging: Key players
    • 9.14.1. Comparing hyperspectral camera manufacturers
    • 9.14.2. Specim: Market leaders in line-scan imaging
    • 9.14.3. Headwall Photonics
    • 9.14.4. Cubert: Specialists in snapshot spectral imaging
    • 9.14.5. Hyperspectral imaging wavelength ranges
    • 9.14.6. Hyperspectral wavelength range vs spectral resolution
    • 9.14.7. Hyperspectral camera parameter table
    • 9.14.8. Companies analysing and applying hyperspectral imaging
    • 9.14.9. Condi Food: Food quality monitoring with hyperspectral imaging
    • 9.14.10. Orbital Sidekick: Hyperspectral imaging from satellites
    • 9.14.11. Gamaya: Hyperspectral imaging for agricultural analysis
  • 9.15. Summary: Hyperspectral imaging
    • 9.15.1. Summary: Hyperspectral imaging
    • 9.15.2. SWOT analysis: Hyperspectral imaging
    • 9.15.3. Supplier overview: Hyperspectral imaging
    • 9.15.4. Company profiles: Hyperspectral imaging

10. EVENT-BASED VISION (ALSO KNOWN AS DYNAMIC VISION SENSING)

  • 10.1. What is event-based sensing?
  • 10.2. General event-based sensing: Pros and cons
  • 10.3. What is event-based vision? (I)
  • 10.4. What is event-based vision? (III)
  • 10.5. What does event-based vision data look like?
  • 10.6. Event-based vision: Pros and cons
  • 10.7. Event-based vision sensors enable increased dynamic range
  • 10.8. Cost of event-based vision sensors
  • 10.9. Importance of software for event-based vision
  • 10.10. Applications for event-based vision
    • 10.10.1. Promising applications for event-based vision
    • 10.10.2. Event-based vision for autonomous vehicles
    • 10.10.3. Event-based vision for unmanned aerial vehicle (UAV) collision avoidance
    • 10.10.4. Occupant tracking (fall detection) in smart buildings
    • 10.10.5. Event-based vision for augmented/virtual reality
    • 10.10.6. Event-based vision for optical alignment / beam profiling
    • 10.10.7. Event-based vision application requirements
    • 10.10.8. Technology readiness level of event-based vision by application
  • 10.11. Event-based vision: Key players
    • 10.11.1. Event-based vision: Company landscape
    • 10.11.2. IniVation: Aiming for organic growth
    • 10.11.3. Prophesee: Well-funded and targeting autonomous mobility
    • 10.11.4. CelePixel: Focussing on hardware
    • 10.11.5. Insightness: Vertically integrated model targeting UAV collision avoidance
  • 10.12. Summary: Event-based vision
    • 10.12.1. Summary: Event-based vision
    • 10.12.2. SWOT analysis: Event-based vision
    • 10.12.3. Supplier overview: Event-based vision
    • 10.12.4. Company profiles: Event-based vision

11. WAVEFRONT IMAGING (ALSO KNOWN AS PHASE-BASED IMAGING)

  • 11.1. Motivation for wavefront imaging
  • 11.2. Conventional Shack-Hartmann wavefront sensors
  • 11.3. Phasics: Innovators in wavefront imaging
  • 11.4. Wooptix: Light-field and wavefront imaging
  • 11.5. Summary: Wavefront imaging
  • 11.6. SWOT analysis: Wavefront imaging
  • 11.7. Supplier overview: Wavefront imaging sensors
  • 11.8. Company profiles: Wavefront imaging

12. FLEXIBLE AND DIRECT X-RAY IMAGE SENSORS

  • 12.1. Conventional x-ray sensing
  • 12.2. Flexible image sensors based on amorphous-Si
  • 12.3. Spray-coated organic photodiodes for medical imaging
  • 12.4. Direct x-ray sensing with organic semiconductors
  • 12.5. Holst Center develop perovskite-based x-ray sensors (I)
  • 12.6. Holst Center develop perovskite-based x-ray sensors (II)
  • 12.7. Technology readiness level of flexible and direct x-ray sensors
  • 12.8. Summary: Flexible and direct x-ray image sensors
  • 12.9. SWOT analysis: Flexible and direct x-ray image sensors
  • 12.10. Supplier overview: Flexible x-ray image sensors
  • 12.11. Company profiles: Flexible and direct x-ray image sensors