I'm having a hard time understanding why this topic needs such deep discussion. Maybe I'm looking at it too simply. In my experience it all boils down to what you use to test the output. I know on A-B output cards, if I put my Fluke on an output that is off, I read 120V. If I use my wiggy, which is my standard practice, I read 0V.
If I used a cheap Walmart meter, I would probably read about 70V. The point is: the lower the internal resistance of the testing device (the wiggy), the more it loads the circuit, and the less likely you are to get fooled by leakage. The higher the internal resistance of the testing device (the Fluke, which is in the megohms), the more likely you are to get fooled by leakage. A higher-impedance meter is more accurate for normal measurements, just don't use it to check output cards. As far as what resistance you need, use a wiggy, it's low enough.
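For anyone who wants to see the arithmetic, here's a minimal voltage-divider sketch of why the three testers read so differently on the same "off" output. The leakage impedance and the meter input impedances are assumed round numbers picked for illustration, not measured values from any particular card or meter:

```python
# Illustrative sketch: an "off" but leaky output modeled as a 120 V source
# behind a high leakage impedance, with the test meter as the only load.
# All component values below are assumptions for the example.

LINE_VOLTS = 120.0          # nominal source voltage (assumed)
LEAKAGE_OHMS = 500_000.0    # assumed leakage-path impedance of the off output

# Assumed input impedances for the three testers mentioned above
METERS = {
    "Fluke DMM (~10 Mohm input)": 10_000_000.0,
    "cheap meter (~1 Mohm input)": 1_000_000.0,
    "wiggy / solenoid tester (~5 kohm)": 5_000.0,
}

for name, meter_ohms in METERS.items():
    # Series divider: reading = V * R_meter / (R_leak + R_meter)
    reading = LINE_VOLTS * meter_ohms / (LEAKAGE_OHMS + meter_ohms)
    print(f"{name}: reads about {reading:.0f} V on an OFF output")
```

With those assumed numbers the high-impedance Fluke shows close to full line voltage, the cheap meter shows something in between, and the wiggy drags the leakage down to roughly nothing, which is exactly the pattern you see at the bench.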