Ok, I've always been led to believe that CFM is a more important measure than psi, and that larger turbos flow grossly more air at a given psi than their smaller brethren. I understand the basic part: the hot air produced when a turbo is overworked leads to a higher psi reading from the same quantity of actual air.
Here's where I start to get confused though: if we keep the intake charge's temperature constant through efficient intercooling, shouldn't these values be the same? Basically, if you pressurize the manifold to 20 psi using 120 degree air from a 14b, or the same temp and psi from, say, a mammoth BR580, shouldn't this be using the same cfm of air?
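My rough reasoning, sketched with the ideal gas law (density = P / (R·T)): if absolute pressure and temperature in the manifold match, the air density matches, so the same engine at the same rpm should be ingesting the same mass of air either way. The numbers below (sea-level baro, dry-air gas constant) are just my assumptions for the sanity check:

```python
R_AIR = 287.05  # J/(kg*K), specific gas constant for dry air

def air_density(boost_psi, temp_f, baro_psi=14.7):
    """Density (kg/m^3) of manifold air from gauge pressure and temp."""
    p_pa = (boost_psi + baro_psi) * 6894.76   # psi gauge -> Pa absolute
    t_k = (temp_f - 32) * 5 / 9 + 273.15      # Fahrenheit -> Kelvin
    return p_pa / (R_AIR * t_k)

# 20 psi of 120 F air is the same density no matter which compressor
# made it -- roughly 2.6 kg/m^3 under these assumptions:
rho = air_density(20.0, 120.0)
print(round(rho, 2))
```

So by that math the density (and hence cfm at a given rpm) only depends on the pressure and temperature, not on which turbo produced them.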
Right when I think it all makes sense, I talk myself back into the classic fallacy of "dude, it's 20 psi!" Someone straighten me out here, please.