Yes, "stretched" means the female part is too large in diameter. That comes from cleaning off excessive corrosion or from stuffing in a meter probe that is larger in diameter than the male part of the terminal. Less obvious is a pushed-out terminal. Service manuals always list that as the first thing to look for, especially when a plug might have recently been disconnected for some other repair. In over thirty years of doing auto electrical work, I only ran into that once myself, and a friend ran into it once a few months ago. It is not that common, but do not overlook the possibility.
When there is a break in the signal wire circuit, whether it is a cut wire or a corroded terminal, the voltage inside the computer on that line can "float" to some random value, since the wire inside the computer is tied to a lot of other circuitry. For explanation purposes, say the acceptable range of signal voltage from a sensor is 0.5 to 4.5 volts. Anything outside that range is what triggers a fault code. When the signal voltage is allowed to float, if it happens to stay within that acceptable range, the computer will regard it as a proper voltage, and it will try to run the engine on it. To prevent that, all signal circuits have an internal "pull-up" or "pull-down" resistor tied to 5.0 volts or to ground. Those resistors are so big electrically that they have no effect whatsoever on a properly-working circuit, but when the circuit is open (cut wire), they will pull the line to 0.0 or 5.0 volts to force it into a bad state and set a fault code. Since you found 0.5 volts, that tells us you have a pull-down resistor, and since you do not have 0.0 volts, the wire can be assumed to be neither open nor grounded.
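If it helps to see that logic laid out, here is a rough sketch in Python. The 0.5-to-4.5 volt window and the pull-to-a-rail behavior come straight from the explanation above; the function names and the sample voltages are made up for illustration, not anything a real engine computer uses.

```python
# Sketch of the fault-detection logic described above (illustrative only).

SIGNAL_MIN = 0.5   # volts - lowest "plausible" sensor reading
SIGNAL_MAX = 4.5   # volts - highest "plausible" sensor reading

def effective_voltage(sensor_volts, circuit_open, pull="down"):
    """Return the voltage the computer actually sees on the signal line.

    With an intact circuit the sensor drives the line; the pull resistor
    is electrically too big to matter.  With an open circuit the pull
    resistor wins and drags the line to a rail: 0.0 V for a pull-down,
    5.0 V for a pull-up.
    """
    if circuit_open:
        return 0.0 if pull == "down" else 5.0
    return sensor_volts

def sets_fault_code(volts):
    """Anything outside the plausible window triggers a code."""
    return volts < SIGNAL_MIN or volts > SIGNAL_MAX

# Healthy circuit: 2.3 V from the sensor, no code sets.
print(sets_fault_code(effective_voltage(2.3, circuit_open=False)))  # False

# Cut wire with a pull-down: the line is forced to 0.0 V, so a code sets.
print(sets_fault_code(effective_voltage(2.3, circuit_open=True)))   # True
```

The point of the sketch is the same as the prose: without the pull resistor, an open circuit could float anywhere inside the good window and the computer would never notice.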
When a code is set related to the MAF sensor, the computer knows it cannot rely on the signal voltage, so it has two back-up strategies. One is to run on the MAP sensor. Chrysler is the only manufacturer that has consistently been able to make their engines run right with only a MAP sensor. All other manufacturers use a MAF sensor on all or the majority of their engines. MAP sensors are so sensitive that they could be used to measure engine speed, because they can detect the tiny extra vacuum pulse each time a piston takes a gulp of air. That is why a lot of engines can run on them when the MAF sensor has failed. The clue is they won't run right under all conditions, because the computers are not designed or programmed for the MAP sensor to be the primary fuel-metering calculation component, and that may be what you have.
The second strategy is that most Engine Computers have the ability to "inject" an approximate value for a sensor that has been detected as defective, and run on that. For example, you know that if an engine has been running for twenty minutes, the coolant temperature cannot really be minus forty degrees, but that is what would be read if the sensor was unplugged. The computer also knows that if the engine is running at 1,000 rpm, there has to be air flowing into the engine, and there has to be fuel commanded from the injectors to go with it. There is one clinker I have to add for people researching this topic. There was a software problem with some GM scanners where they would display that injected sensor value instead of the actual value coming from the sensor. That made it look like the sensor was reading correctly and there was no reason for the fault code to be setting. Normally we do not have customers' time and money to waste checking voltages, but in a case like this, when the scanner is displaying a value that does not agree with the fault code or the engine's operating condition, that is the time to take a reading right at the sensor and compare it to the scanner's reading. Normally the computer will be running the engine on the injected value, but it will be telling the scanner the actual sensor value. The first time you run into this, you will be scratching your head until you are bald!
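That substitute-value strategy can be sketched in a few lines of Python. Everything here is invented for illustration: the sensor names, the default numbers, and the function are hypothetical stand-ins, since real computers use calibrated tables, not these values.

```python
# Hypothetical sketch of the "injected value" back-up strategy.

LIMP_HOME_DEFAULTS = {
    "coolant_temp_f": 180.0,   # a warmed-up engine cannot really be -40 F
    "maf_grams_sec":  5.0,     # a running engine must be drawing some air
}

def value_for_fueling(sensor, raw_value, fault_codes):
    """Run the engine on a substitute value when the sensor is flagged.

    Note the scanner trap described above: a scanner that displays
    this substitute instead of raw_value makes the sensor look fine
    even while a fault code is set.
    """
    if sensor in fault_codes:
        return LIMP_HOME_DEFAULTS[sensor]
    return raw_value

# MAF code is set: the computer ignores the bogus reading of 0.0
# and runs on the injected approximation instead.
print(value_for_fueling("maf_grams_sec", 0.0, {"maf_grams_sec"}))  # 5.0
```

The head-scratcher in the GM scanner story is exactly the gap between `raw_value` and what this function returns: the engine runs on one number while the display shows the other.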
Going further without a scanner is going to be difficult. Also, I do not know if your signal voltage is supposed to go up toward 4.5 volts or down toward 0.5 volts when engine speed increases. Most MAP sensors' voltages go down as vacuum goes up, but that is not relevant to this sad story. On top of that, the service manual only says the scanner will list MAF as grams per second, and that it should go up with increased engine speed. I know what a volt is. I do not know what a gram looks like. Chrysler's DRB3 lists sensors with their signal voltages and with the value each is computed to represent. In other words, the coolant temperature sensor signal voltage might be 3.5 volts, but it will also be listed as 182 degrees, for example. Almost all of the more expensive scanners work the same way.
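For anyone curious how a scanner turns a signal voltage into a number like "182 degrees", here is a hedged sketch. The calibration points in the table are invented just to make the 3.5-volt example land on 182; real coolant sensors are non-linear thermistors with factory calibration curves, not a straight-line table like this.

```python
# Illustrative sketch: converting a signal voltage to the value it
# represents, the way the DRB3 display is described above.

CAL_TABLE = [  # (signal volts, degrees F) - hypothetical points
    (0.5, 250.0),
    (2.0, 212.0),
    (3.5, 182.0),
    (4.5,  40.0),
]

def volts_to_degrees(v):
    """Linearly interpolate between calibration points; clamp at the ends."""
    pts = sorted(CAL_TABLE)
    if v <= pts[0][0]:
        return pts[0][1]
    if v >= pts[-1][0]:
        return pts[-1][1]
    for (v0, d0), (v1, d1) in zip(pts, pts[1:]):
        if v0 <= v <= v1:
            return d0 + (d1 - d0) * (v - v0) / (v1 - v0)

print(volts_to_degrees(3.5))  # 182.0
```

The scanner is doing nothing magic: it reads the same volts your meter does, then looks the value up.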
Normally professionals do these electrical tests before condemning a sensor, like you did, then order a new one if appropriate, which you also did. This is one of those frustrating times when proper procedure did not yield the expected results. Assuming we can trust that the sensor is good, you might consider unplugging it, then using four jumper wires to connect the wires to the sensor. If you still get 0.5 volts on the signal wire, pop that signal jumper wire off, then measure right on the sensor's terminal. If the voltage comes up to normal while the engine is running, suspect that the signal wire is grounded. The 0.5 volts you found can also be explained by the meter picking up magnetic interference. You can prove that to yourself by just setting the meter on the running engine without the probes connected to anything.
If you should find the signal voltage is correct and responding normally, but you still have a hesitation problem, fuel pressure is the next thing to look at. From driving around with a fuel pressure gauge attached to my old 1988 Grand Caravan for a year, I know that normal pressure is around 50 pounds, and the engine runs fine down to twenty pounds. That much tolerance is unusual. A lot of engines develop hesitations and/or failure to start if fuel pressure is only four or five pounds too low.
There is one more parting comment about digital voltmeters. In my forty years as a television repairman and thirty years as a mechanic and instructor, I have accumulated over a dozen digital meters, but I refuse to own one with the "auto-ranging" feature. I have used them, and often went down the wrong path from failing to notice that the meter changed ranges on me. I have found "15 volts", which is good, when in reality I had 0.15 volts, which in effect is 0 volts, and that is not good. If you are using an auto-ranging meter, you won't be the first person to be fooled. Somewhat related to that, if you are on a scale that is too high, say the "200-volt" scale, there is very little difference between 0.5 and 4.5 volts. All meters have a tolerance, meaning the amount they can be off, and that typically includes a percentage of the voltage being read, plus one to three counts in the last digit. That means a reading of 0.5 on the 200-volt scale could actually be 2.8 volts. One percent of 200 volts is two volts, and three counts on the last digit, at 0.1 volts per count on that scale, adds another 0.3 volts, which brings the displayed 0.5 volts up to a possible 2.8 volts. A meter could read only 0.5 volts when measuring 2.8 volts, and still be considered to be in calibration. To prevent that error, always use the lowest scale possible that does not send the meter into an over-range condition.
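The worst-case arithmetic above can be worked through in a couple of lines. This sketch assumes a tolerance spec of plus or minus one percent of full scale plus three counts in the last digit, following the numbers in the example; be aware that many real meters specify a percentage of the *reading* rather than of full scale, so check your own meter's manual.

```python
# Worked version of the worst-case meter-error arithmetic above,
# assuming a spec of +/-(1% of full scale + 3 counts).  Real meters
# vary; the numbers follow the example in the text.

def worst_case_true_voltage(reading, full_scale, last_digit_step, counts=3):
    pct_error = 0.01 * full_scale           # 1% of the 200 V scale = 2.0 V
    count_error = counts * last_digit_step  # 3 counts of 0.1 V = 0.3 V
    return reading + pct_error + count_error

# On the 200 V scale the display resolves 0.1 V per count:
print(round(worst_case_true_voltage(0.5, 200.0, 0.1), 3))    # 2.8

# The same 0.5 V reading on a 2 V scale (0.001 V per count) can
# only be off by 0.02 + 0.003 volts:
print(round(worst_case_true_voltage(0.5, 2.0, 0.001), 3))    # 0.523
```

That second print is the whole argument for picking the lowest usable range: the allowable error shrinks by two orders of magnitude.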
Sunday, October 23rd, 2016 AT 5:21 PM