It seems that since Leonardo da Vinci first placed his mental concept of a flying machine on paper, we aviators have had a hard time separating fact from fiction. Aviation, like most other human endeavors, has its share of commonly stated "truths" that the scientific data doesn't support. Let's call them aviation myths. Why, then, do they persist? One reality is that we are all busy in our daily lives, and digging through reams of material to verify what someone else says is just too time-consuming, not to mention boring. Another is that we have learned that those who have gone before us are generally a reliable source of information on how not to get one's name into an NTSB report, much less look foolish in one. It's an understandable shortcoming. Let's take a look at a few commonly held aviation concepts related to engine operation that are unsupported by data, contradicted by data or, at the very least, called into question by it.

High rpm increases wear

This one looks like it should be true. In a way it is, but let's look more closely. A thin film of oil must separate the metal parts inside an engine or it won't be running for very long: in literally seconds, metal-to-metal contact will result in catastrophic failure. So where does this increased wear really come from, and is it something we should be concerned about? Does oil wear metal? Sure, maybe over a thousand years. How else did David get that smooth, slick river rock to sling at Goliath? But does it matter to my 1700-hour-TBO engine? In the grand scheme of operational concerns, there are many other things to worry about before questioning whether high rpm cause abnormally high engine wear. This is at most a third-order effect, and there seems to be no reason to reduce rpm on that account. That said, other reasons to use lower rpm are supported by data. For example, lower rpm increase propeller efficiency, improving the conversion of horsepower into thrust.
That's like getting free horsepower. Lower rpm result in reduced frictional losses. Lower rpm are also quieter, and in an aviation-unfriendly environment less noise can help our cause. A corollary to this myth is that reducing rpm shortly after takeoff is always a good idea. The short answer is "no." I use maximum rpm for takeoff and climb as a matter of routine because it produces more power and less stress on the cylinders while providing maximum mass airflow through the engine, which equates to power when operating rich of peak EGT (ROP). Higher rpm result in lower internal cylinder pressures, too. The real question should be, "Are the minor, third-order wear effects of high rpm offset by the reduction in stress on the cylinders?" The data suggest they are. Even though using max rpm during takeoff and climb to altitude tends to reduce overall engine stress, I'm willing to place a bit more stress on the cylinders at lower rpm to be a good neighbor. Reducing prop speed by about 100 rpm right after takeoff for noise abatement up to about 1000-1500 feet (some prefer 5000 feet), then returning to max rpm for the rest of the climb, addresses both concerns and is compatible with the science of noise and cylinder pressures. Bottom line? Use the rpm setting you need and don't worry about wear.

High mp stresses the engine

This is another one that is partially true but really depends on a number of other factors. High manifold pressure (mp) is, in and of itself, not dangerous or stress-inducing for the engine. Any concerns about high mp should be about the other factors acting in concert with manifold pressure and their combined effect on internal cylinder pressures. Yes, mp matters, but in what way? Remember, 29 in. Hg is roughly what your engine experiences parked in the hangar! Increased mp is part of an equation involving mass airflow; as mentioned, mass airflow is what determines hp production when rich of peak.
There is a "top-of-the-green" mp limitation the manufacturer has placed on the engine to help us keep it from exceeding internal cylinder pressures that could be detrimental to reaching TBO. Increasing mp does increase internal cylinder pressures because, historically, it meant more hp, all else being equal. Seldom, however, is all else equal. With more and more pilots operating lean of peak EGT (LOP), engines are seeing lower internal cylinder pressures than they saw ROP. What this means is that the manufacturer's internal cylinder pressure limit is not reached at the same mp LOP as it is ROP. Let's look at how this happens. Suppose we are flying a TCM IO-550 variant ROP at 23 in. Hg and 2300 rpm. This power setting produces 69 percent of rated power; approximately 207 hp is being produced. Assuming we can get a higher mp in the same engine at the same altitude, we can produce the same 207 hp at 25 or 26 inches and 2300 rpm, LOP, on 13.9 gph. We know from measured data on internal cylinder pressure that, in the latter example with the higher mp, internal cylinder pressure is lower than when ROP at the same power. Obviously, saying that higher mp produces more stress on the engine is not always correct. Ironically, higher mp may mean less stress than lower mp, depending on those other factors. To quote someone with more hours in more types of aircraft than many of us can ever hope to have, "Old Bob" Siegfried: "It all depends."

Turbo cool-down

No one seems to know for sure where the idea came from that sitting at idle for a period of time after a flight cools off a turbocharger, particularly one housed under a tight cowl. Instead, there is compelling evidence that the coolest a turbo ever gets is after a low-power approach, about the time the aircraft touches down and turns off the runway. So far, no one has offered any data suggesting otherwise.
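Returning to the IO-550 example above, the numbers are easy to check. A minimal arithmetic sketch, assuming a 300-hp rated engine and the commonly cited rule of thumb that, lean of peak, a Continental engine with 8.5:1 compression produces roughly 14.9 hp per gallon per hour of fuel flow (both figures are my assumptions, not stated in the text):

```python
# Sanity check on the IO-550 power example.
# Assumptions (not from the article): 300 hp rated power for the IO-550,
# and the LOP rule of thumb of ~14.9 hp per gph at 8.5:1 compression.

RATED_HP = 300          # assumed IO-550 rated power
LOP_HP_PER_GPH = 14.9   # rule-of-thumb constant, 8.5:1 compression

# ROP case: 69 percent of rated power
rop_hp = 0.69 * RATED_HP
print(f"69% of {RATED_HP} hp = {rop_hp:.0f} hp")  # 207 hp

# LOP case: power tracks fuel flow, not manifold pressure
lop_hp = LOP_HP_PER_GPH * 13.9  # 13.9 gph from the example
print(f"13.9 gph LOP = about {lop_hp:.0f} hp")    # ~207 hp
```

Both paths land on roughly 207 hp, which is the point of the example: LOP, the fuel flow sets the power, and the higher manifold pressure along the way does not by itself mean higher stress.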
Measured oil temperature is lower after a long descent and landing, and it will frequently heat back up as one sits on the ramp at idle with no cooling airflow through the oil cooler. This myth just makes no sense, but the conventional wisdom on the need for a turbo cool-down period to let the oil in the turbo cool and prevent "coking" lives on, unsupported by any data I've seen. Turbocharged aircraft were flown for many years before this idea was ever heard of, with no ill effects from turning off the runway, taxiing straight to a nearby hangar and immediately shutting down. I have seen no data to suggest that turbo failures were more of a problem then than they are now, and the failure rate is very low anyway. Based on watching my own engine, sitting at idle heats things back up after landing. I've come to call this the "turbo heat-up period," and I don't do it. The biggest reason not to adhere to the conventional wisdom on turbo cool-downs is that you have to wait five extra minutes to go pee. That could be serious!

Shock cooling

Wanna start a brawl at the local airport? Mention that shock cooling is a myth. If you do, be ready to duck and cover. We do have a guess as to where this myth came from and how it got started. When twin Cessnas were first introduced, pilots began flying higher for longer periods of time, and the fuel became cold-soaked at altitude. After a rapid descent, pilots entered the pattern and followed the POH recommendation to go full rich as part of the pre-landing checklist. A large quantity of very cold fuel would blast onto the warm intake chamber wall; that's where the cracks were forming. It was a valid observation tagged with a misplaced solution. Reducing mp two in. per five minutes has no effect on this issue whatsoever, except that you have to start the let-down many miles earlier than you would otherwise. Besides, how can you shock-cool something that's not hot to start with? Manufacturers have long recommended keeping CHTs under about 400 deg.
F in cruise because doing so has a positive effect on durability. Collected data strongly support that idea; many operators are more conservative and adopt 380 deg. F as a better "red line." There is compelling evidence that with CHTs under about 420 deg. F, it's just not possible to shock-cool anything. Repeatable data from my engine monitor show that if I immediately pull the power from 85 percent in cruise to 50 percent and descend at 1000 fpm at the top of the green arc, the cylinders cool from a range of 340-375 deg. F to 290-300 deg. F over about 10 minutes, an average of less than 10 deg. F per minute. That's a dramatic power change by most pilots' standards. Once at that temperature, they remain stable until the approach, where they slowly climb a few degrees as I slow down and tool around. Even the initial cool-down rate is often no more than 30 deg. F per minute. Why would one want to mess around with a minimal, timed power reduction that increases workload prior to an approach, especially when descending earlier into hotter summer air ends up producing the same CHTs? Even in the face of mechanics who claim some problem was caused by shock cooling, I have yet to see any hard data suggesting that those who don't worry about shock cooling have any more cylinder problems than those who meticulously manage mp reductions, convinced they can somehow shock-cool an engine that's not hot to start with.

Never lean above 75 percent

When a clean-sheet-of-paper engine design is undertaken, one of the first things the engineer decides is what maximum internal cylinder pressures can be tolerated for long-term durability. Red lines and operating parameters are then established to maintain control over those pressure limits. The engine is rated at full-rich takeoff power, and leaning from that mixture results in more than rated power and higher-than-designed internal cylinder pressures.
An engine leaned at reduced power, below 75 percent, will have a better chance of staying in one piece, so the one-size-fits-all recommendation not to lean above 75 percent power is not a bad one. Here's the rub: "It all depends" applies here as well. Historically, these recommendations from Lycoming and TCM were based on ROP operation, primarily because the engine as originally delivered was incapable of operating LOP; in some cases, it wouldn't even run at peak EGT without significant roughness. That problem left operators with only the ROP side of the mixture spectrum. With the advent in 1998 of GAMIjector fuel injectors, and the subsequent increase in LOP operations due to balanced fuel/air ratios, the game has changed operationally and scientifically. Measured data show that an engine running at 85 percent power, appropriately LOP, operates under significantly less internal-cylinder-pressure stress than one operating at 75 percent power 50 deg. F ROP, per the long-standing OEM recommendations. This procedure is well within the OEM's original concern of managing internal cylinder pressures, and it makes the "no leaning above 75 percent power" recommendation incomplete. Cirrus and Columbia have led the way as far as OEMs are concerned, but the old boilerplate recommendations have not kept pace until today. Kudos to Tornado Alley Turbo and Cirrus. It feels good to know that some people are not willing simply to copy the old boilerplate wording, and are willing to update their recommendations as the engine-management world expands its knowledge, capability and understanding. Do I think it's critical that everyone change their way of doing things? No. The idea here is for us to think a little about why we do what we do and re-examine the realities of what some call aviation myths.