The European Space Agency has announced the completion of the camera that’s to be used in its Gaia mission: a billion-pixel mosaic comprising 106 individual CCDs in a 0.5x1 meter array. Assembled in May and June at Astrium’s facility in Toulouse, the camera is designed to map around a billion stars when Gaia’s five-year mission …
These corps make CPU chips with 1e9 transistors. I don't know how big an individual element in a CCD is... could a billion-pixel CCD be fabbed at 28 or 32 or 40 nm per pixel?
They could probably make it, but it would be shit: the CCD would capture almost no light. If you want a decent image of faint objects, a large CCD is required; otherwise you need to boost the light intensity, which means you introduce noise.
The fundamental thing about CCD sensors is that as you make them smaller, the light wells into which the photons fall get smaller. Thus for each exposure, which is essentially an ADC sample of the charge accumulated in the well, you get less charge and a worse signal-to-noise ratio. Pixels are one area where smaller is rarely better; best left to the specialists.
The problem with pixel size is that as you make pixels smaller, they receive fewer photons. That means the amount of signal reduces with the pixel size, but the noise doesn't (if anything, the noise increases, as you need more gain to actually measure the number of photons). Generally, for a good signal-to-noise ratio, the bigger the pixel the better (within limits!)
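The scaling argument above can be sketched numerically. This is a toy shot-noise model with made-up flux and read-noise figures (not Gaia's actual detector parameters): signal grows with pixel area, while shot noise only grows as its square root and read noise is fixed per readout.

```python
import math

def snr(pixel_pitch_um, flux_per_um2, read_noise_e=5.0, exposure_s=1.0):
    """Toy model: signal scales with pixel area; total noise is
    photon shot noise (sqrt of signal) plus a fixed read-noise floor."""
    signal = flux_per_um2 * pixel_pitch_um**2 * exposure_s   # electrons collected
    noise = math.sqrt(signal + read_noise_e**2)              # shot + read noise
    return signal / noise

# Halving the pitch quarters the signal, so SNR drops by more than
# a factor of two once the read-noise floor starts to dominate.
for pitch in (10.0, 5.0, 2.5):
    print(f"{pitch:4.1f} um pitch -> SNR {snr(pitch, flux_per_um2=4.0):.1f}")
```

With these illustrative numbers, going from 10 µm to 2.5 µm pixels cuts the SNR from roughly 19 to roughly 3.5, which is the "bigger pixel is better" point in miniature.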
It would be like cramming a 2 week holiday to Venice into 1 day by only staying in each tourist location for 1 minute, and travelling between them at 70MPH. Yeah, you'd fit it all in, but you'd miss a lot of the subtlety and the quality of your holiday might be a bit lower :)
Maybe, but the more pixels you put on a chip, the smaller they are and so the less light-sensitive they are. For astronomical use, you need large pixels.
One thing about the article - the camera itself is NOT stereoscopic in the classical sense of a pair of sensors taking images of the same view at the same time. How could it possibly detect the parallax shift of objects light years away if it did that? The answer is that it will image each area of the sky from the two extremes of the Earth's orbit, thus giving enough separation between the images for depth information to be recovered.
In a word no...
The minimum size of a photosensor element in a CCD is effectively dictated by the optics and the nature of visible light, not by the ability to fabricate smaller elements. Indeed, a 40nm element would be something like an order of magnitude smaller than what any visible-light optical instrument could theoretically resolve, and for many types of optical instrument the practical resolving power is significantly worse than that. Whilst there is some value in the sensor "out-resolving" the optics, there is a law of diminishing returns, and there are other issues. One of these is the ability of the photosensors to hold enough excited electrons to provide a decent dynamic range; that ability is directly related to the photosensor size. Then there is the requirement to read all these elements: a single billion-cell CCD would take a long time, and due to the "bucket-brigade" nature of a CCD it would be likely to introduce more errors and noise. A CCD also has to allow for space around the photocells for insulation and for circuitry, which reduces the surface area actually available for sensing light, and that is very bad news.
For this sort of work, the optimal photosensor size is probably of the order of several microns, or a good two orders of magnitude larger than the feature sizes used for producing processor chips.
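To put rough numbers on the diffraction limit mentioned above (the wavelength and focal ratio here are illustrative assumptions, not Gaia's actual optics): the Rayleigh criterion puts the Airy disc radius at the focal plane at 1.22 λN for focal ratio N, and Nyquist sampling wants pixels about half that.

```python
def airy_radius_um(wavelength_nm, f_number):
    """Airy disc radius at the focal plane per the Rayleigh criterion:
    r = 1.22 * lambda * N, for focal ratio N."""
    return 1.22 * wavelength_nm * 1e-3 * f_number

# A hypothetical f/5 instrument at 550 nm: Airy radius ~3.4 um,
# so Nyquist sampling suggests pixels around 1.7 um. That is
# roughly 40 times larger than a 40 nm element.
r = airy_radius_um(550, 5)
print(f"Airy radius: {r:.2f} um, Nyquist pixel: {r / 2:.2f} um")
```

Even this generous diffraction-limited case leaves a 40 nm pixel massively oversampled, before the well-depth and readout problems are considered.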
I believe (my experience is with microscopes, not telescopes) that a major problem is minimizing noise -- achieved by operating the CCDs at low temperature. Does this mean that Gaia will be kept in the shadow; i.e. permanently in the night sky?
Thank you, all
Thanks for the explanations!
not without shrinking light
There isn't much point making a 28nm pixel when a visible photon is 500-1000nm
a visible photon is 500-1000nm
Are you referring to the effective diameter of a photon or the wavelength of human eye detectable radiation?
I for one am not looking forward to 2 hour uncompressed videos of astronomers cats falling off telescopes followed by a similar amount of data in the form of blog postings demanding broadband upgrades because the web has got too slow again...
Stereo View Of The Stars?
If you want a 3D image of stars, don't your 'eyes' have to be very, very far apart? If the two imaging arrays are set at 106.5 degrees to each other, aren't they then looking in very different directions?
Can the El Reg hive mind shed any (star) light on this please?
I imagine that you would take two pictures, 6 months apart. Then your "eyes" are at opposite sides of the Earth's orbit (2 AU).
If I'm reading this right...
The ESA say that the two telescopes are at a precise 106.5 degree angle to each other. This means they can use this as a reference point that shows where the imaged stars are _in relation to other imaged stars_. Using this data the ESA will produce a relatively accurate 3D map of star positions (and velocities) for a huge number of stars. It's not a design for stereoscopic imaging of single objects, but a method of fixing the location of objects in relation to other objects for the production of maps that can be displayed stereoscopically.
Of course, I may _not_ be reading this right...
(Sometimes I look at what humans can do with our technologies, and think "we are so cool!". Then I look to what we do to each other and other animals, and I despair. Is that just me?)
I may have missed something, but I believe space is really, really big, so unless the CCDs are separated by a correspondingly large amount there isn't going to be much depth perception going on.
It looks like Thoguht with his 'Gigapixel?' post answered that one. I'm still surprised that there would be enough separation.
Yes they probably could fab a CCD. BUT
The noise factor between the individual pixels would more than likely make it unusable.
Then there is the post capture processing.
How the camera deals with the information that comes from the CCD is also important.
This is cool -
thanks to the Reg and Mr Chirgwin for picking it up !...
It will be followed by a suitable lens correction plugin for Lightroom?
"106 individual CCDs in a 0.5x1 meter array."
Or "metre" if you live in Europe.
metre vs meter
It is only 'metre' in non-US English-speaking countries. The continentals all call it 'meter'.
It's metre in France !
The Germans definitely call them Meters.
German is a DIFFERENT language to French
"Ein (1) Meter. Zehn (10) Meter."
No plural "s".
Low light levels ...
... need a larger collection area so that sufficient photons can be collected to provide reasonable signal/noise.
made by e2v Chelmsford ...
Which tickled something in my head, and Google supplied the rest. e2v was once EEV and EEV was once the English Electric Valve company. A great example of how a company has to change to stay in business. History here http://www.alphatronlinac.com/default.asp?articleid=94
A Five Year Mission ....
... to boldly show what no-one has seen before.
Yeah, I'll get my coat.
Which L2 Lagrange point?
"Gaia will be stationed at the L2 Lagrange"
The L2 of the Earth-Sun system, or the L2 of the Earth-Moon system?
Actually... if it's only 1.5Mkm, I guess it must be the earth-moon.
Sun and Earth: 1,500,000 km (930,000 mi) from the Earth
Earth and Moon: 61,500 km (38,200 mi) from the Moon
Wikipedia, so the truthiness answer is Earth-Sun.
The Moon is only 4e5km away from the Earth. The L2 point can't be more than double the orbital distance.
The telescope won't be completely in the shade as the Earth's umbra only extends to a little less than 1.4e6km. So, the Earth will block most of the light from the Sun, but not all.
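The two L2 distances quoted above both fall out of the standard Hill-sphere approximation, under which L2 lies roughly r·(m/3M)^(1/3) beyond the smaller body. A quick check with textbook mass and orbit values:

```python
def l2_distance_km(orbit_radius_km, m_small, m_large):
    """Hill-sphere approximation: the L2 point sits roughly
    r * (m / (3*M))**(1/3) beyond the smaller body of mass m."""
    return orbit_radius_km * (m_small / (3.0 * m_large)) ** (1.0 / 3.0)

# Sun-Earth L2: about 1.5 million km beyond Earth (where Gaia sits).
print(l2_distance_km(1.496e8, 5.972e24, 1.989e30))
# Earth-Moon L2: about 61,500 km beyond the Moon.
print(l2_distance_km(3.844e5, 7.342e22, 5.972e24))
```

The approximation reproduces both figures quoted in the thread, which settles the Earth-Sun versus Earth-Moon question: 1.5 million km can only be Sun-Earth L2.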
Stereoscopic and 3D
It's a bit more complicated.
The telescope is two barrels pointing 106 deg apart looking at different sets of stars.
Gradually as it moves each telescope will see each star a lot of times
So you measure the 2D XY pixel position of star1 on camera1 and star2 on camera2. Later you have star2 on camera1 and star3 on camera2.
Eventually you have a few billion XY pixel measurements of 20 million stars, and then you do the mother of all simultaneous equations to get their relative 2D positions.
Then, independently, you work out the distance to each star (from movement, colour, brightness etc.) and, et voilà, you have a 3D map. A bit like GPS: the lat/long position is incredibly good but the distance (height) is a lot worse.
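The "mother of all simultaneous equations" is an overdetermined least-squares problem. Here is a deliberately tiny 1D toy of the idea (positions, observation pairs, and noise level are all invented): each measurement gives the angular difference between two co-observed stars, one star is pinned down to anchor the frame, and the rest are solved jointly.

```python
import numpy as np

# Toy global solve: pairwise "star j minus star i" measurements,
# one unknown position per star. Star 0 is fixed at zero to anchor
# the reference frame; everything else is solved by least squares.
rng = np.random.default_rng(0)
true_pos = np.array([0.0, 1.0, 2.5, 4.0])           # hypothetical star positions
pairs = [(0, 1), (1, 2), (2, 3), (0, 2), (1, 3)]    # which stars were co-observed

A = np.zeros((len(pairs), len(true_pos)))
b = np.zeros(len(pairs))
for row, (i, j) in enumerate(pairs):
    A[row, i], A[row, j] = -1.0, 1.0                # x_j - x_i = measurement
    b[row] = true_pos[j] - true_pos[i] + rng.normal(0, 0.01)  # noisy difference

# Append one extra equation (x_0 = 0) to fix the otherwise-free offset.
A = np.vstack([A, np.eye(1, len(true_pos))])
b = np.append(b, 0.0)
solved, *_ = np.linalg.lstsq(A, b, rcond=None)
print(np.round(solved, 2))   # close to true_pos
```

The real Gaia solution is vastly larger and also fits attitude, calibration, and motion parameters, but the structure, redundant relative measurements pinned together globally, is the same.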
Sounds awesome... will they get a free copy of Photoshop with it?
billion pixel porn
You know they have to do that before they shoot that thing into space. we all want to see it.
31,622x31,622 pixel close up crotch shot. Think of the detail!
But if it can't...
...do HD 1080 movies at 25 FPS I can't see a market for it and I certainly wouldn't buy one.