Tim,
Unfortunately I don't have the time to read all of the extensive follow-ups in here, but for scientific purposes sensor performance should be, and is, evaluated at the pixel level; anything else is just a waste of time. When we measure the quantum efficiency of a sensor in the lab, we look at individual pixels to measure internal and external quantum efficiencies; the total area of the sensor does not matter at all. For example, compare a large sensor with small pixels against a small sensor with large pixels: the image from the small sensor with large pixels will have lower per-pixel noise and will look cleaner, albeit at lower resolution. It all depends on pixel area, not sensor area. You could take 25 small sensors from a regular digicam and stitch them together into one huge sensor, but the IQ would not be any better :)
One pixel is one pixel, and when you do measurements one pixel on the sensor should map to one pixel on the screen. Because photon arrival is quantized (shot noise follows Poisson statistics), everything else being equal a smaller pixel will always have higher shot noise than a larger pixel; thus a 50D image always has higher noise than a 40D image at 1:1 size, with no magnification whatsoever.
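To make the shot-noise point concrete, here is a minimal simulation sketch. The pixel pitches and photon flux below are purely illustrative assumptions, not measured values for the 50D or 40D; the point is only that SNR scales as the square root of the collected photon count, which is proportional to pixel area.

```python
import numpy as np

rng = np.random.default_rng(0)

flux = 1000.0  # photons per square micron (hypothetical exposure level)
# Pixel pitches in microns -- illustrative values only, not camera specs
pixel_areas = {"small_pixel": 4.7 ** 2, "large_pixel": 5.7 ** 2}

snr = {}
for name, area in pixel_areas.items():
    mean_photons = flux * area
    # Photon arrival is a Poisson process: std dev = sqrt(mean),
    # so per-pixel SNR = mean / sqrt(mean) = sqrt(mean_photons)
    samples = rng.poisson(mean_photons, size=100_000)
    snr[name] = samples.mean() / samples.std()
    print(f"{name}: SNR ~ {snr[name]:.1f} (theory: {mean_photons ** 0.5:.1f})")
```

Running this shows the larger pixel's SNR exceeding the smaller pixel's by roughly the ratio of the pixel pitches, exactly the per-pixel advantage described above.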
P.S. If you have access to technical journals and want to find out how we measure QE, see this excellent article published in several IEEE journals; unfortunately I cannot post any of the content due to copyright issues.
A Method for Estimating Quantum Efficiency for CMOS Image Sensors
Boyd Fowler, Abbas El Gamal, David Yang, and Hui Tian
Information Systems Laboratory, Stanford University