SCIENCE

NASA develops 36-pixel sensor - and each pixel measures the energy of the light that hits it

The sensor will be placed in the 'Resolve' instrument and provide additional data about what it photographs.

36 pixels might sound like extraordinarily few, considering that camera coverage almost reflexively speaks of megapixels rather than pixels. But in this case it is not a typo: NASA's engineers are in full swing developing an advanced sensor that adds an extra dimension of data to its images.

Prototype of the sensor.

The sensor's 6 x 6 pixel array covers roughly the same area as the main camera sensor of an iPhone 15 or 15 Plus - around 8 x 5 mm. That makes each sensor element enormous by camera standards, and deliberately so: the large area is what allows each pixel to capture detailed information about the X-ray photons that strike it.

The sensor is set to become the heart of the 'Resolve' instrument on the XRISM mission, which will detect X-rays - radiation carrying far more energy than visible light - and thereby reveal which substances the radiation comes from. Because each of the sensor's elements measures temperature, and thereby photon energy, so precisely, the instrument can detect gas flows in, for example, a galaxy cluster far from Earth.
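
The link between precise energy measurements and gas flows is the Doppler effect: gas moving toward or away from us shifts the energy of its X-ray emission lines, and a detector that resolves a few electron volts can see that shift. A minimal sketch of the arithmetic, using the well-known hot-iron emission line near 6,700 eV as a reference; the measured value below is invented for illustration and is not a NASA figure:

```python
# Illustrative arithmetic, not NASA code: turning a measured shift in an
# X-ray emission line into a line-of-sight gas velocity via the Doppler
# effect. The iron-line rest energy is standard; the measurement is made up.

SPEED_OF_LIGHT_KM_S = 299_792.458

def line_of_sight_velocity_km_s(rest_energy_ev: float,
                                measured_energy_ev: float) -> float:
    """Gas velocity from a Doppler-shifted emission line, v = c * dE / E.

    Positive = gas receding from us (line shifted to lower energy).
    Non-relativistic approximation, fine for cluster-scale flows.
    """
    return SPEED_OF_LIGHT_KM_S * (rest_energy_ev - measured_energy_ev) / rest_energy_ev

# Hot iron in a galaxy cluster emits near 6,700 eV; a 5 eV shift in that
# line corresponds to gas streaming away from us at a few hundred km/s.
print(line_of_sight_velocity_km_s(6_700.0, 6_695.0))  # ~224 km/s
```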

The sensor's downside is that it only works extremely cold: around -273.1 degrees Celsius, or 0.05 degrees above absolute zero.
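
The extreme cold follows from how this kind of sensor works: each pixel acts as a tiny calorimeter, where an absorbed X-ray photon heats the pixel and the temperature jump reveals the photon's energy. A minimal sketch of that principle; the pixel heat capacity is an assumed order-of-magnitude value for illustration, not a figure from NASA:

```python
# Minimal sketch of the calorimeter principle: an absorbed X-ray photon
# heats the pixel, and the temperature jump gives its energy via
# delta_T = E / C. The heat capacity is an assumed illustrative value.

EV_TO_JOULE = 1.602_176_634e-19          # exact conversion factor
PIXEL_HEAT_CAPACITY_J_PER_K = 1e-12      # assumption: ~1 pJ/K for a tiny, ultra-cold pixel

def temperature_rise_kelvin(photon_energy_ev: float) -> float:
    """Temperature jump caused by one absorbed photon, delta_T = E / C."""
    return photon_energy_ev * EV_TO_JOULE / PIXEL_HEAT_CAPACITY_J_PER_K

# A 6,000 eV X-ray photon raises the pixel's temperature by roughly a
# thousandth of a degree - easily drowned in thermal noise at room
# temperature, but a clear signal at 0.05 degrees above absolute zero.
print(temperature_rise_kelvin(6_000.0))  # ~9.6e-4 K
```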