In conventional image sensor designs, a dye color filter is placed on top of each photodetector to measure the intensity of light at a given wavelength. However, dye filters cannot be made arbitrarily small: below a certain size they become ineffective at filtering colors. Moreover, when the pixel size falls below 1 μm, the pixels begin to suffer from harmful electromagnetic interference.
With the goal of improving image sensors in mind, my research at Berkeley focused on the methods and materials from which image sensors can be made. Working in Prof. Zhang’s department, I studied sensor development and its applications to learn how image sensors can be shrunk through nanotechnology.
There, I helped develop a sensor that addresses the interference problem and eliminates the dye filter entirely by exploiting silicon’s light-absorbance properties. In this design, silicon nanostructures of different sizes are fabricated and arranged together to form a new kind of image detector; because a nanostructure’s size determines which wavelengths it absorbs, no separate color filter is needed. As a whole, this design should yield a pixel roughly five times smaller in each linear dimension than that of the world’s best camera today, increasing image resolution by a factor of 25 while also improving color accuracy, low-light performance, and capture speed.
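The resolution figure follows from simple area scaling: shrinking the pixel pitch by 5× in each linear dimension fits 5 × 5 = 25× more pixels into the same sensor area. A quick sketch of that arithmetic (the pitch value below is an illustrative assumption, not a measured device parameter):

```python
def resolution_gain(linear_shrink: float) -> float:
    """Pixels per unit sensor area scale with the square of the linear shrink."""
    return linear_shrink ** 2

# Assumed reference pitch for a current small-pixel sensor (hypothetical).
reference_pitch_um = 1.0
new_pitch_um = reference_pitch_um / 5

print(new_pitch_um)         # → 0.2 (μm pixel pitch after a 5x shrink)
print(resolution_gain(5))   # → 25.0 (x gain in pixel count for the same area)
```

The same scaling explains why sub-micron pitches are attractive despite the filtering and interference challenges described above: modest linear shrinks compound quadratically in resolution.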