Author / Abbao
Editor / Abbao
Produced by / Abbao1990
1. Pinhole Imaging
In the early Warring States period, the Chinese scholar Mozi (c. 468 BC – 376 BC) and his disciples completed the world’s first recorded pinhole-imaging experiment, documented in the “Mo Jing”. The text observes that the image is inverted because light rays travel in straight lines and cross at the small aperture.
This passage explains why pinhole imaging occurs and identifies the straight-line propagation of light; it is generally regarded as the first scientific explanation of that property.
In the Western world, it was not until around 350 BC that the ancient Greek scholar Aristotle described the same phenomenon, and only then did Westerners come to understand the optical principle behind pinhole imaging.
2. Camera Obscura
At the end of the 15th century, people built the camera obscura based on the principle of pinhole imaging; it was essentially the prototype of the camera. The Italian Leonardo da Vinci documented the camera obscura in his writings, describing how people used the device as an aid for sketching and painting.
In 1553, the Italian scholar Porta published a book titled “Natural Magic” in which he detailed the use of the camera obscura: by simply tracing the projected image on paper with a pencil, one could outline the scene and then color it in, producing a highly realistic picture in true proportion.
3. Silver Plate Photography
In 1822, the Frenchman Niépce made the world’s first photograph on photosensitive material, although the image was unclear and required an eight-hour exposure. In 1826, using a camera obscura, he made the earliest surviving photograph on a pewter plate coated with light-sensitive bitumen.
In 1838, the Frenchman Daguerre invented the daguerreotype process: a silver-plated copper plate, sensitized with iodine vapor to form silver iodide, was exposed in a camera obscura, developed with mercury vapor, and fixed with a common-salt solution. The method produced a sharp, direct positive image on metal that could be permanently preserved. Daguerre then built the world’s first practical camera around this process; it required an exposure of 20–30 minutes.
On August 19, 1839, the French government, having acquired the rights to the daguerreotype, announced the invention and presented it freely to the world. This day is generally regarded as the birth of photography.
4. Film Cameras & Digital Cameras
In 1866, the German chemist Schott and the optician Abbe, working with Zeiss, developed barium crown optical glass, which made higher-quality photographic lenses possible and rapidly advanced the design and manufacture of photographic lenses.
Photosensitive materials continued to develop: dry plates coated with silver bromide emulsion appeared in 1871, and film on a nitrocellulose base emerged in 1884.
In 1888, Kodak produced a new type of photosensitive material – soft, rollable “film” – a leap forward in photosensitive materials. In the same year, Kodak introduced the world’s first portable box camera to use roll film, the Kodak No. 1, a home camera sold under the slogan “You press the button, we do the rest.” This camera can be regarded as the ancestor of consumer cameras.
In 1969, the CCD chip was invented at Bell Labs in the United States, laying the technical foundation for the transition of photographic photosensitive materials to electronics.
In 1981, after years of research, Sony produced the world’s first camera to use a CCD electronic sensor as its photosensitive material (the Mavica), laying the foundation for electronic sensors to replace film. Subsequently, Panasonic, Copal, Fuji, and several American and European chip makers invested in CCD research and development, laying the technical groundwork for the digital camera.
In 1987, cameras using CMOS chips as the photosensitive material appeared at Casio.
Looking back at the development of cameras: from pinhole imaging to the camera obscura, from the daguerreotype to film, and from film to electronic sensors.
5. Internal Structure and Working Mechanism of Camera Modules
Lens: gathers light and projects the scene onto the imaging medium. Some lenses are a single element; others use multiple glass elements to achieve better image quality.
Filter: the human eye sees only the visible spectrum, while the image sensor responds to a much wider spectrum than the eye does. A filter is therefore added to block the out-of-band light (chiefly infrared), so that the image sensor captures the scene as the human eye actually sees it.
Image sensor chip (CMOS): the imaging medium; it converts the image (a light signal) projected onto its surface into an electrical signal.
PCB substrate: carries the electrical signals from the image sensor to the back end. For in-vehicle cameras, the PCB carries additional circuitry that converts the parallel camera signals to serial transmission, improving resistance to interference.
The camera module works as follows: the lens gathers the light, and the IR filter removes unwanted infrared. The light, still an analog signal, then reaches the CMOS sensor chip, which outputs a digital signal. Some designs include an ISP (image signal processor) chip on the camera side to process the signal before transmitting it to the host; in other designs the camera omits the ISP and the host’s built-in ISP performs the image processing instead, which eases heat dissipation and reduces radiation at the camera.
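To make this flow concrete, here is a minimal Python sketch (our own illustration, not from the original article) of the kind of steps an ISP performs on the sensor’s RAW digital output. The RGGB Bayer layout, the gain values, and all function names are assumptions chosen purely for illustration:

```python
import numpy as np

def demosaic_naive(raw):
    """Collapse each 2x2 RGGB Bayer cell into one RGB pixel.
    Real ISPs use far more sophisticated interpolation."""
    rgb = np.zeros((raw.shape[0] // 2, raw.shape[1] // 2, 3), dtype=np.float32)
    rgb[..., 0] = raw[0::2, 0::2]                          # R sites
    rgb[..., 1] = (raw[0::2, 1::2] + raw[1::2, 0::2]) / 2  # average the two G sites
    rgb[..., 2] = raw[1::2, 1::2]                          # B sites
    return rgb

def white_balance(rgb, gains=(1.8, 1.0, 1.5)):
    """Per-channel gains (illustrative values) that neutralize the light source."""
    return rgb * np.asarray(gains, dtype=np.float32)

def gamma_correct(rgb, gamma=2.2):
    """Compress linear sensor values into a display-friendly range."""
    return (rgb / rgb.max()) ** (1.0 / gamma)

# A fake 8-bit RAW frame standing in for the sensor's digital output.
raw = np.random.randint(0, 256, size=(480, 640)).astype(np.float32)
processed = gamma_correct(white_balance(demosaic_naive(raw)))
print(processed.shape)  # (240, 320, 3)
```

A real automotive ISP also performs dead-pixel correction, noise reduction, lens-shading correction, and tone mapping, but the pipeline shape – RAW in, viewable RGB out – is the same.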
6. Structure of In-Vehicle Cameras
The structure of in-vehicle cameras is shown in the image above. A camera mounted outside the vehicle must be a complete, sealed camera assembly so that it is waterproof. For an in-cabin DVR, waterproofing is not a concern, so the bare camera module described above can be used.
Differences between the CCD and CMOS image sensors:
A CCD (Charge-Coupled Device) is made mainly of silicon semiconductor. Its basic principle is similar to the solar cell on a CASIO calculator: through the photoelectric effect, the photosensitive element senses incoming light and converts it into stored charge. Simply put, when the shutter opens and light from the lens strikes the CCD surface, the CCD converts the light energy into charge; the stronger the light, the more charge is produced, and the amount of charge becomes the measure of the light’s intensity.
A CMOS (Complementary Metal-Oxide-Semiconductor) sensor is likewise built on silicon semiconductor, on which N-type (negative) and P-type (positive) devices coexist; the currents generated by these two complementary structures can be recorded and interpreted by the processing chip.
The biggest difference between CCD and CMOS lies in the position and number of amplifiers.
Each time a CCD is exposed, once the shutter closes or the internal clock ends the exposure (an electronic shutter), readout proceeds by pixel transfer: the charge signals of each row of pixels are shifted in sequence into a buffer (a charge storage register), guided along the bottom row to the single amplifier beside the CCD for amplification, and then passed serially to the ADC (analog-to-digital converter) for output.
In a CMOS design, each pixel is connected directly to its own amplifier, so the photoelectric signal is amplified on the spot and then sent to the ADC for conversion into digital data. Because of this structural difference, CCD and CMOS perform differently. The CCD’s strength is signal integrity during transfer (a dedicated charge-transfer channel), with every pixel’s signal brought to a single amplifier for uniform processing, preserving data consistency. The CMOS process is simpler and has no dedicated transfer channel, so each pixel’s signal must be amplified locally before the data from all pixels are combined.
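The architectural contrast can be caricatured in a few lines of Python. This sketch is our own illustration rather than anything from the article: the CCD path shifts every pixel’s charge through one shared amplifier, while the CMOS path amplifies at each pixel, which is where per-pixel gain mismatch (fixed-pattern noise, exaggerated here) creeps in. The gain and noise figures are made-up parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
exposure = rng.uniform(0.0, 1.0, size=(4, 4))  # charge collected per pixel

def ccd_readout(charge, gain=10.0):
    """CCD: charge is shifted out row by row and every pixel passes
    through the same single output amplifier before one ADC."""
    serial_stream = charge.reshape(-1)   # the charge-transfer shift register
    amplified = gain * serial_stream     # ONE amplifier shared by all pixels
    return amplified.reshape(charge.shape)

def cmos_readout(charge, gain=10.0):
    """CMOS: every pixel has its own amplifier, so small gain mismatches
    between pixels (an exaggerated 2% here) appear in the output."""
    per_pixel_gain = gain * (1 + 0.02 * rng.standard_normal(charge.shape))
    return per_pixel_gain * charge       # an amplifier AT each pixel

print(ccd_readout(exposure))   # uniform gain everywhere
print(cmos_readout(exposure))  # slightly different gain at each pixel
```

The point of the sketch is only where the amplifier sits in each data path; real CMOS sensors add calibration and sampling techniques to suppress exactly this mismatch.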
1. Sensitivity Differences: because each CMOS pixel contains an amplifier and A/D conversion circuitry, this extra circuitry squeezes the light-sensitive area (the fill factor) of the pixel, so at the same pixel size a CMOS sensor is less sensitive than a CCD. A rough worked example follows.
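As a back-of-the-envelope illustration (the pixel size and fill-factor numbers below are assumptions, not measurements from any datasheet):

```python
pixel_area_um2 = 3.0 * 3.0   # same 3 um pixel pitch for both sensors (assumed)
fill_factor_ccd = 0.70       # fraction of the pixel that is photosensitive (assumed)
fill_factor_cmos = 0.40      # reduced by the per-pixel amplifier circuitry (assumed)

light_ccd = pixel_area_um2 * fill_factor_ccd
light_cmos = pixel_area_um2 * fill_factor_cmos
print(f"CMOS pixel collects {light_cmos / light_ccd:.0%} of the light a CCD pixel does")
# -> CMOS pixel collects 57% of the light a CCD pixel does
```

Microlenses on modern CMOS sensors recover much of this loss, but the underlying trade-off is as shown.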
2. Resolution Differences: In the first point of