Can you guys expand on this a little more, as perhaps I missed something? I was under the impression that the only difference between two identical engines running 30 pounds of boost pressure, with two different-sized turbos (or single vs. twins), would be the denser air charge from the larger (or extra) turbo(s). I would agree with Donnie that boost pressure is an irrelevant reading; it's just telling you how hard that particular turbo(s) is working to achieve that amount of pressure. How is backpressure the sole factor in the equation for making more power? If you compare twins against one equally matched single turbo (exhaust housing, flow ability, MAP, etc.), run two 2 1/2" downpipes on the twins and a 5" downpipe on the single, how exactly do the twins make more peak power at a given RPM (note that the twins will obviously make more power under the curve)? Please explain, as I am very curious....
Good question, Lethal, and Fryguy brings most of the answer: efficiency. One large compressor wheel/housing will leak ("backflow") much less than two smaller units, causing less temperature rise of the intake charge after compression; this is what compressor maps graph.
Any of us playing with "boost" has seen the results (or lack thereof) of cranking it up. To a point it's all fun and games, but then the wall of diminishing returns is found; with turbos, that wall might be the compressor side on some setups and the exhaust side on others.
The easiest way to diagnose which side is helping or hurting you is to measure the temperature of the air charge coming out of the compressor (before the intercooler). You can then "connect the dots" on a compressor map to "see" where you are playing.
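To make the "connect the dots" step concrete, here's a minimal sketch that turns a boost reading and a compressor-outlet temperature into the efficiency figure you'd look up on a map. All the probe readings in it are made-up example numbers, not data from any particular car:

```python
# Estimate compressor isentropic efficiency from measured boost and outlet temperature.
# Probe readings below are hypothetical examples, not data from a real car.

GAMMA = 1.4  # ratio of specific heats for air

def compressor_efficiency(t_in_f, t_out_f, boost_psi, baro_psi=14.7):
    """Return (pressure ratio, isentropic efficiency) from probe readings."""
    t_in = t_in_f + 459.67    # convert F to absolute temperature (Rankine)
    t_out = t_out_f + 459.67
    pr = (baro_psi + boost_psi) / baro_psi         # total pressure ratio
    t_ideal = t_in * pr ** ((GAMMA - 1) / GAMMA)   # ideal (isentropic) outlet temp
    eff = (t_ideal - t_in) / (t_out - t_in)        # ideal temp rise vs. actual
    return pr, eff

# Example: 80 F inlet air, 350 F out of the compressor, 30 psi of boost
pr, eff = compressor_efficiency(t_in_f=80.0, t_out_f=350.0, boost_psi=30.0)
print(f"pressure ratio {pr:.2f}, compressor efficiency {eff:.0%}")
```

With the pressure ratio and efficiency in hand, you can find the island you're sitting in on the wheel's published map; the hotter the outlet for the same boost, the further off the efficient islands you are, and extra boost mostly buys you heat.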
Measuring the exhaust side is much simpler, with pressure measured before the turbine and after (normally atmospheric unless a restriction exists).
For any compressor wheel there will be a necessary shaft rpm to achieve "boost," and this rpm is supplied by the exhaust wheel (still with me?).
If the compressor requires more shaft torque (speed) to achieve compression than the turbine can supply/transmit to the shaft, exhaust pressure (pre-turbine) will need to increase in order to overcome the resistance of the compressor trying to do its job; this shows up as backpressure (pre-turbine).
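You can rough out that balance on paper: the turbine can only supply so much shaft power at a given expansion ratio, so a bigger compressor power demand forces a bigger pre-turbine pressure. The sketch below backs out the required expansion ratio; every efficiency, temperature, and flow number in it is a made-up illustration, not a measured value:

```python
# Back out the expansion ratio a turbine needs to drive a given compressor.
# All efficiencies, temperatures, and flows are made-up example values.

CP_AIR, G_AIR = 1005.0, 1.4    # J/(kg*K) and gamma for intake air
CP_EXH, G_EXH = 1150.0, 1.33   # rough values for hot exhaust gas

def required_expansion_ratio(comp_power_w, exh_flow_kgs, egt_k, turb_eff):
    """Expansion ratio (pre-turbine abs / post-turbine abs) to supply comp_power_w."""
    avail = exh_flow_kgs * CP_EXH * egt_k * turb_eff  # max power at infinite ratio
    x = (G_EXH - 1) / G_EXH
    # Turbine power = avail * (1 - ER**-x); solve for ER
    return (1.0 - comp_power_w / avail) ** (-1.0 / x)

# Compressor demand: 0.35 kg/s of air heated 150 K by compression
comp_power = 0.35 * CP_AIR * 150.0   # shaft power the compressor consumes, in watts
er = required_expansion_ratio(comp_power, exh_flow_kgs=0.36, egt_k=1200.0, turb_eff=0.65)
print(f"needed expansion ratio: {er:.2f}")
```

Run the same numbers with a hotter (less efficient) compressor, i.e. a bigger temperature rise for the same boost, and the required expansion ratio climbs; with an atmospheric downpipe, that ratio is your pre-turbine backpressure.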
Anytime exhaust pressure is higher than intake pressure, there will be a natural but unwanted "EGR" condition.
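With the two pressure probes described above, spotting that condition is just a comparison. A trivial sketch (the readings are made-up illustrations):

```python
# Compare pre-turbine drive pressure to intake boost to flag the "EGR" condition.
# Gauge readings below are hypothetical examples.

def drive_to_boost(pre_turbine_psi, boost_psi):
    """Return drive-pressure/boost ratio; above 1.0, exhaust beats intake."""
    return pre_turbine_psi / boost_psi

for pre_turbine, boost in [(28.0, 30.0), (45.0, 30.0)]:
    ratio = drive_to_boost(pre_turbine, boost)
    verdict = "exhaust wins: reverse flow during overlap" if ratio > 1.0 else "intake wins"
    print(f"{pre_turbine:.0f} psi drive vs {boost:.0f} psi boost -> {ratio:.2f}, {verdict}")
```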
Many accept their EGR penalty by changing valve timing (overlap) to accommodate this contamination; while it lessens the contamination, it severely hurts the engine's natural breathing throughout its operating range.
In my experience, most turbocharger combinations (compressor side vs. turbine side) are out of whack: way too much compressor for the turbine driving it.
I'll wager that many here do not understand how seriously this contamination affects tuning in regards to fuel and timing requirements (a conversation for another day).
If someone here has a running car, a temperature probe at the compressor outlet, a pressure probe before the turbine, and the patience to work through this, I would be more than willing to lay out the data in a separate comprehensive thread.
Good luck,
Kevin.