Re: But why would it show a consistent decline over 100 years?
I can't buy the idea that the apparent reduction in light from Tabby's Star over time is an artifact of the data. The group's argument is that "the source data from Digital Access to a Sky Century @ Harvard includes half a million glass plates shot between 1885 and 1993, using a number of different instruments and cameras."
Going even further than MartinG, I'd expect the standard procedure would be to consider only plates containing enough additional stars to calibrate the entire plate, comparing every star (and galaxy) on the plate being calibrated against every other plate/image in which any of those stars and galaxies appear.
This wouldn't eliminate the problem of variations in emulsion sensitivity across each plate, but since the plates were prepared specifically for scientific measurement, I'd expect the 'noise' variability across each plate to be pretty low, and certainly well below the 20% the team seems to treat as a typical noise level (is 20% noise even science?).
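To make the idea concrete, here's a minimal sketch of the kind of ensemble calibration I mean: each plate's zero-point is estimated from the offset of its comparison stars against the ensemble average across all plates, iterated until stable. All names and the toy data are hypothetical illustrations, not the actual DASCH pipeline.

```python
import numpy as np

def calibrate_plates(mags, n_iter=5):
    """mags: 2-D array, rows = plates, columns = comparison stars,
    entries = instrumental magnitudes (NaN where a star is off-plate).
    Returns per-plate zero-point offsets and ensemble star magnitudes."""
    zero_points = np.zeros(mags.shape[0])
    for _ in range(n_iter):
        # Ensemble magnitude of each star: mean over plates after
        # removing each plate's current zero-point estimate.
        star_means = np.nanmean(mags - zero_points[:, None], axis=0)
        # Each plate's zero-point: mean offset of its stars from the ensemble.
        zero_points = np.nanmean(mags - star_means[None, :], axis=1)
    return zero_points, star_means

# Toy data: 3 plates, 4 comparison stars; plate 2 reads 0.3 mag too faint.
true_mags = np.array([12.0, 13.5, 11.2, 14.1])
mags = np.vstack([true_mags, true_mags, true_mags + 0.3])
zp, means = calibrate_plates(mags)
# Relative to the mean, plate 2's offset stands out and can be removed
# before any long-term trend is measured.
print(np.round(zp - zp.mean(), 3))
```

A systematic change of instrument or emulsion shows up here as a zero-point shift shared by a run of plates, which is exactly what cross-plate comparison is supposed to catch.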
Fwiw, Kepler's noise floor appears to be around 80 ppm, while two of the brightness dips Kepler measured from Tabby's Star in recent times were 15% and 22%.
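Just to put those two numbers side by side (figures taken from above; the arithmetic is mine):

```python
# 80 ppm noise floor vs. 15% and 22% dips: how far above the noise are they?
kepler_noise = 80e-6           # ~80 ppm, i.e. 0.008%
dips = [0.15, 0.22]            # measured dip depths as fractions
for d in dips:
    ratio = d / kepler_noise
    print(f"{d:.0%} dip = {ratio:,.0f} times the noise floor")
```

So the dips are thousands of times above Kepler's noise floor; whatever one thinks of the century-long plate data, those events are not noise.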