Thanks to new processes, architectures, and technology advances, CMOS image sensors are finding homes well beyond the mobile phone and digital camera markets, moving into a wider range of products that includes more demanding machine-vision and automotive applications. These breakthroughs are also allowing designers to replace the more expensive charge-coupled device (CCD) image sensors traditionally used in applications that require ultra-high resolution.
Improvements range from better viewing in all lighting conditions and excellent LED flicker mitigation to far fewer motion artifacts at high dynamic range and lower power consumption.
Selecting the right CMOS image sensor for a design is more complex than it might seem, and specifications aren’t always what they appear. Engineers who use CMOS image sensors need to dig deeper into specs such as optical format (sensor size), dynamic range, and even field of view, all of which can cause confusion or misunderstanding. They also need to pay attention to other factors, such as optics, thermal management, and power consumption.
It’s a complex multi-disciplinary domain that requires users of CMOS image sensors to pay attention to the interaction with the optics, thermals, and power supply, among other issues, said Jon Stern, director of optical systems at GoPro.
Stern also cited several “weird” legacy specifications that can trip up designers, such as optical format, which is not the diagonal size of the sensor. The image size in inches was inherited from the days when the optical format was defined by the mechanical outside diameter of a vidicon imaging tube, so there are a lot of little things that can be pitfalls, he said.
The diagonal size of the image sensor is important because it directly impacts the size of the lens. The legacy numbers don’t convert directly, however: a 1/2.3-inch optical format, for example, corresponds to an actual sensor diagonal of roughly 7.7 mm, not the 11 mm that a naive inch-to-millimeter conversion would suggest.
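As a rough illustration (not from Stern’s talk), a commonly used rule of thumb maps the legacy optical format to roughly 16 mm of image diagonal per nominal inch, a holdover from vidicon tube geometry, rather than the 25.4 mm a direct conversion would give:

```python
# Rule-of-thumb sketch (illustrative, not a formal standard): the legacy
# "inch" optical format corresponds to roughly 16 mm of sensor diagonal
# per nominal inch, not 25.4 mm.
def approx_diagonal_mm(format_inches: float) -> float:
    """Approximate sensor diagonal for a legacy optical format."""
    return format_inches * 16.0

print(approx_diagonal_mm(1 / 2.3))  # ~7.0 mm, close to the ~7.7 mm of real 1/2.3" sensors
print((1 / 2.3) * 25.4)             # ~11.0 mm, the misleading naive conversion
```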
Another potential pitfall is the modulation transfer function (MTF), a measure of how much contrast and detail a lens preserves, which lens manufacturers provide as an indication of optical performance. “Again, this is a weird historical legacy specification,” said Stern. “What’s quoted by the lens manufacturer is the ‘as designed’ performance of the lens, which doesn’t account for manufacturing tolerances. So when you buy a lens, the performance will be less than the stated MTF. If you want to get ‘as built’ performance, you need to ask the manufacturer for Monte Carlo simulations or some kind of test spec that you can characterize yourself.”
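For context, a first-order sketch of how such figures combine (the derating factor below is a hypothetical placeholder, not a vendor number): component MTFs multiply at a given spatial frequency, and the sensor side of the budget is bounded by the pixel pitch:

```python
def sensor_nyquist_lp_per_mm(pixel_pitch_um: float) -> float:
    """Sensor Nyquist limit in line pairs per mm: 1 / (2 * pixel pitch)."""
    return 1000.0 / (2.0 * pixel_pitch_um)

def system_mtf(lens_mtf_as_designed: float, sensor_mtf: float,
               tolerance_derating: float = 0.85) -> float:
    """First-order cascade: component MTFs multiply at a given spatial
    frequency. The derating factor stands in for the gap between
    'as designed' and 'as built' lens MTF; a real value has to come from
    the vendor's Monte Carlo or test data, as Stern notes."""
    return lens_mtf_as_designed * tolerance_derating * sensor_mtf

print(sensor_nyquist_lp_per_mm(1.55))        # ~323 lp/mm for a 1.55-um pixel
print(system_mtf(lens_mtf_as_designed=0.5,   # hypothetical lens MTF at Nyquist
                 sensor_mtf=0.64))           # ~2/pi, ideal pixel aperture at Nyquist
```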
Stern is hosting a technical session, “CMOS image sensors: A guide to building the eyes of a vision system,” at the upcoming virtual Embedded Vision Summit, Sept. 15–17, where he will discuss many of these legacy specifications. He also will cover the basics of image sensors — types, characteristics, how to select the right sensor for your application, and practical tips on matching the sensor with optics for building a camera module. His presentation is one of several that focus on image sensors.
Stern shared a sneak peek into some of the discussion areas and his insight into key trends in the image sensor market. While he plans to discuss the basic operation, types, and characteristics of CMOS image sensors in his presentation, a big part of the session focuses on how to select the right sensor for your application and provides practical guidelines for building a camera module by pairing the sensor with the right optics.
A lot of image sensor presentations focus on the nuts and bolts inside an image sensor, and while that is interesting, he said, it’s not really what you need to know in order to use image sensors. He plans to focus on areas that demand special attention and common pitfalls in designing imaging systems. “The focus is giving people an overview of the domains where attention needs to be paid as a user of CMOS image sensors and to provide some guidelines on how to start the process — how do you select an image sensor, how do you select the optics,” he said.
Even if a designer decides to use a module and has found the right image sensor, it’s very likely that it’s not going to be paired with the right optics for the application, he noted.
So a big part of Stern’s discussion will cover how to match the optics with the sensor and which specifications deserve close attention: optical format; MTF; dynamic range, which is frequently misquoted by sensor manufacturers, according to Stern; lens field of view; and guidelines on understanding the chief ray angle.
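As a simple illustration of this kind of optics-to-sensor matching (a sketch with hypothetical numbers, not material from Stern’s session), the focal length needed for a target field of view follows from the sensor dimension via FOV = 2·atan(d/2f):

```python
import math

def focal_length_mm(sensor_dim_mm: float, fov_deg: float) -> float:
    """Focal length giving the requested field of view across a sensor
    dimension (width, height, or diagonal), thin-lens approximation."""
    return sensor_dim_mm / (2.0 * math.tan(math.radians(fov_deg) / 2.0))

def field_of_view_deg(sensor_dim_mm: float, focal_mm: float) -> float:
    """Inverse: field of view for a given focal length."""
    return math.degrees(2.0 * math.atan(sensor_dim_mm / (2.0 * focal_mm)))

# Hypothetical example: a ~7.7-mm diagonal (1/2.3"-class sensor) and a
# 120-degree diagonal field of view call for a ~2.2-mm focal length
print(focal_length_mm(7.7, 120.0))    # ~2.22 mm
print(field_of_view_deg(7.7, 2.22))   # ~120 degrees
```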
Specialization, faster interfaces, and tradeoffs
At a higher level, though, there are a number of trends that are impacting the CMOS image sensor market, like specialized functionality required for specific applications, faster interfaces, and better thermal management.
Stern said that there has always been specialization, going back to the early days of CCDs, but in general, sensors were less diverse than they are now because of today’s applications. He cited advanced driver-assistance systems as an example that requires special functions added to the sensors to support automotive requirements. Two requirements that have become essential for automotive are ASIL compliance and LED flicker mitigation.
“LED flicker mitigation is maturing, but certainly that’s been starting to become table stakes now for a lot of the automotive sensors,” said Stern.
“So those specialist functions are now starting to show up,” he said. “Equally, and it’s been a long time coming, but at some point in the future, there’ll be more CMOS image sensors used for machine vision.”
But, he explained, there are always tradeoffs in functionality, and they generally center on signal-to-noise ratio (SNR) and optical format.
But, again, it depends on the application. SNR generally won’t be a challenge in classical machine-vision systems, but mobile systems have to operate across a range of illumination conditions, Stern said.
In every system, whether it’s a GoPro camera producing pictures and videos for people to share or a system recognizing objects, you’re almost always challenged by SNR, and you face a tradeoff with the size of the image sensor, which is also a cost and physical-size tradeoff, he said.
Designers also have to consider other tradeoffs that go hand in hand, like dynamic range, resolution, and motion artifacts.
In addition, designers often don’t pay enough attention to thermal management: keeping the image sensor cool matters because temperature directly affects the sensor’s noise characteristics, including read noise and dark-current performance.
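To make these tradeoffs concrete, here is an illustrative back-of-the-envelope noise model (all numbers hypothetical, not GoPro’s): photon shot noise, the dark current Stern mentions, and read noise add in quadrature, which is why larger pixels, which collect more signal, buy SNR at the cost of sensor size:

```python
import math

def snr_db(signal_e: float, read_noise_e: float,
           dark_current_e_per_s: float, exposure_s: float) -> float:
    """First-order pixel SNR in dB. Shot noise variance equals the signal,
    accumulated dark charge adds its own shot noise, and read noise adds
    in quadrature (all quantities in electrons)."""
    dark_e = dark_current_e_per_s * exposure_s
    total_noise = math.sqrt(signal_e + dark_e + read_noise_e ** 2)
    return 20.0 * math.log10(signal_e / total_noise)

# Hypothetical numbers: 1,000 e- of signal, 2 e- read noise,
# 10 e-/s dark current, 33-ms exposure
print(snr_db(1000.0, 2.0, 10.0, 0.033))  # ~30 dB
```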
CMOS image sensors are highly integrated now; they’re systems on a chip, which means you don’t need a huge amount of domain expertise to use them because everything is controlled through digital interfaces, said Stern. As a result, it is common for designers to just look at the datasheet and say, “OK, the maximum operating temperature is 70°C” for a commercial part, or extended beyond that for industrial and automotive grades, he added.
The problem is that the pixel’s dark current, a leakage current that degrades image quality, roughly doubles with every six-degree rise in temperature, and a lot of other noise sources get worse if you don’t pay attention to temperature, he said.
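That doubling rule is easy to fold into a thermal budget. A minimal sketch of the scaling (the reference dark current and temperature below are illustrative placeholders):

```python
def dark_current_at(temp_c: float, ref_dark_e_per_s: float = 10.0,
                    ref_temp_c: float = 25.0, doubling_c: float = 6.0) -> float:
    """Scale a datasheet dark-current figure to another temperature,
    assuming it doubles every `doubling_c` degrees (~6 C, per Stern).
    The reference values here are hypothetical placeholders."""
    return ref_dark_e_per_s * 2.0 ** ((temp_c - ref_temp_c) / doubling_c)

# A sensor characterized at 25 C but running near its 70 C commercial
# limit sees dark current grow by 2^(45/6), roughly 180x
print(dark_current_at(70.0) / dark_current_at(25.0))  # ~181
```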
There are a number of techniques that can be implemented to handle thermal management, but it has to be considered from the outset of the design; otherwise, it could cause huge problems in product development.
Stern cited the case of a Korean electronics manufacturer that was still in the transition of moving from CCDs to CMOS image sensors in 2007. The company was building a consumer camcorder with CMOS image sensor technology but didn’t pay attention to cooling the sensor. Several problems arose as a result, but the company couldn’t find a good way to put a Band-Aid on the problems, and the product was scrapped.
“Though this is an extreme example, as a former application engineer for an image sensor company, we frequently would encounter users of image sensors who see them as simple-to-use devices with simple digital interfaces, and as a result, they would pay inadequate attention to thermals and then have to Band-Aid it at a late stage in product development,” said Stern.
“For GoPro — without revealing too much — we’re dealing with high frame rates, high data rates, and, of course, we have these rather small battery-powered handheld systems and they are waterproof, which makes it even harder to expel heat, but this is a constant challenge of balancing the heat in the system and the sensor being one of the primary components we have to pay attention to,” he said.
And there are other factors to consider when selecting the right sensor for your product. In the case of GoPro, that means support for slow-motion capture at high frame rates in 1080p.
This also requires fast interfaces, which, for image sensors, are “a bit of a mess at the moment,” said Stern. There are so many standards to choose from, and that’s something that plagues the image sensor interface world, he added.
One interface that is gaining traction is MIPI C-PHY. Stern believes it will start to dominate in the area of high-speed sensors, and the big hope is that it will start to displace some of the proprietary interfaces and become a widely adopted standard.
He also said to keep an eye on hybrid [die] stacking, which will yield substantial advances for CMOS image sensors in the future; it has the potential to deliver the performance benefits of both rolling-shutter and global-shutter designs.
If you’re especially interested in image sensors, there also will be a panel discussion on the future of image sensors, sponsored by Applied Materials, with panelists from Solidspac3, Aurora, and OmniVision. The panel will cover trends such as integrating processors on the same die as sensors, ultra-low-power sensors, and new image-sensing approaches, including event-based and neuromorphic sensing, as well as new techniques for optical depth sensing and hyperspectral imaging.
The Embedded Vision Summit will be virtual this year, but it will still offer 75+ technical sessions and tutorials covering the same wide range of topics, from artificial intelligence and machine learning to image sensors, microprocessors, and SoCs for a range of applications. There also will be 50+ exhibitors on the virtual show floor.