Re: the Eagle Nebula dubbed the Pillars of Creation
And that's how most astronomical images were and still are taken, even back when photographic plates and films were used: emulsions with specific wavelength sensitivities were chosen, with filters added when needed. Color film was never good for collecting scientific data.
CCD/CMOS sensors only count the photons that hit each pixel; they can't record each photon's wavelength. The day a sensor can measure wavelength per pixel will be a big breakthrough.
Bayer arrays as used in consumer cameras are not the right solution for scientific imaging, where every single pixel can matter.
As these objects don't move quickly, it's far better to use a plain monochrome sensor and put the filters needed for a given observation in front of it - this yields far more precise data. WFC3 has a lot of filters: http://www.stsci.edu/hst/wfc3/ins_performance/ground/components/filters
Then, if you need it, you can reconstruct a color image knowing the filter wavelengths. Actually, your camera does exactly the same thing: it takes three different BW images through its Bayer filter, and the "demosaicing" process combines them - although a Bayer array has twice as many green filters as blue or red, and each BW image doesn't cover the same pixels, so each pixel of the resulting image is an interpolation.
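To make the interpolation concrete, here is a minimal sketch (assuming an RGGB Bayer layout and a naive neighbor-averaging scheme; real cameras use more sophisticated algorithms): each pixel of each color channel is reconstructed as the mean of the nearby sensor pixels that actually sit under a filter of that color.

```python
import numpy as np

def bayer_mosaic(rgb):
    """Simulate a Bayer sensor: keep one color sample per pixel (RGGB)."""
    h, w, _ = rgb.shape
    mosaic = np.zeros((h, w))
    mosaic[0::2, 0::2] = rgb[0::2, 0::2, 0]  # red sites
    mosaic[0::2, 1::2] = rgb[0::2, 1::2, 1]  # green sites
    mosaic[1::2, 0::2] = rgb[1::2, 0::2, 1]  # green sites (twice as many greens)
    mosaic[1::2, 1::2] = rgb[1::2, 1::2, 2]  # blue sites
    return mosaic

def demosaic(mosaic):
    """Naive demosaic: each output pixel of each channel is the mean of
    the same-color sensor sites in its 3x3 neighbourhood - i.e. an
    interpolation, as described above."""
    h, w = mosaic.shape
    masks = np.zeros((h, w, 3), dtype=bool)
    masks[0::2, 0::2, 0] = True  # where the red samples live
    masks[0::2, 1::2, 1] = True  # green
    masks[1::2, 0::2, 1] = True  # green
    masks[1::2, 1::2, 2] = True  # blue
    out = np.zeros((h, w, 3))
    for y in range(h):
        for x in range(w):
            ys = slice(max(y - 1, 0), y + 2)
            xs = slice(max(x - 1, 0), x + 2)
            for c in range(3):
                sites = masks[ys, xs, c]
                out[y, x, c] = mosaic[ys, xs][sites].mean()
    return out
```

For a flat, single-color field this reconstruction is exact; on real detail it blurs and can produce color artifacts, which is one reason a plain sensor with sequential filters gives cleaner scientific data.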
Depending on the filters used, you can create an image close to what the human eye would see, or one with specific colors chosen to highlight details useful for scientific investigation. After all, our eyes show just one possible representation of reality too - a "technology" that evolved for surviving on Earth, not for investigating the Universe.
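The famous Pillars of Creation image is a good example of such a representative-color mapping: the narrowband S II, H-alpha and O III exposures were assigned to the red, green and blue channels (the so-called "Hubble palette"). A minimal sketch of that channel assignment, with a simple per-channel stretch (the actual pipeline is of course far more involved):

```python
import numpy as np

def false_color(s2, ha, o3):
    """Map three monochrome narrowband frames onto RGB channels
    (S II -> red, H-alpha -> green, O III -> blue), each frame
    stretched independently to the [0, 1] range."""
    def stretch(img):
        lo, hi = img.min(), img.max()
        if hi == lo:  # flat frame: avoid division by zero
            return np.zeros_like(img, dtype=float)
        return (img - lo) / (hi - lo)
    # dstack the three BW frames into an (h, w, 3) color cube
    return np.dstack([stretch(s2), stretch(ha), stretch(o3)])
```

Nothing here is the "true" color of the nebula; the mapping is chosen so that emission from different ions stands out as different hues.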
Moreover, even monopack color films are three layers of BW images - the colors are added to each layer during development. And even Technicolor used a beam splitter and filters to record images onto three separate BW films, which were then "printed" onto a single one.
So, whatever color image you see is something processed from BW images, with colors added along the way.