I am just trying to figure out how calibrating between DCC and DC can be done. The decoder comes set up with some sort of default speed table, so I guess the only thing left is how the DCC side gets calibrated to the DC side. From the looks of it, the locomotive that needs the highest voltage to start moving would be the one that defines what the rest of the fleet has to follow. That would make my new GP40P-2 the baseline for all the others, which would mean tweaking the r-t-r decoders to start at notch one around seven volts or so. I guess it takes a serious voltmeter for starters to see how each one performs.
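
For what it's worth, here is a rough sketch of the arithmetic I have in mind, just to check my own thinking. It assumes a decoder whose CV2 (Vstart) runs 0 to 255 as a more or less linear fraction of full track voltage, and a nominal 14 V DCC track supply; real decoders and boosters vary, and the fleet readings below are made up, so treat this as an illustration rather than a recipe.

```python
# Back-of-the-envelope conversion from a measured DC start voltage to a
# CV2 (Vstart) value to try on the decoder.  Assumes CV2 runs 0-255 as a
# linear fraction of full track voltage -- decoders differ, so verify on
# the rails.

DCC_TRACK_VOLTAGE = 14.0  # assumed nominal DCC track voltage, in volts

def estimated_cv2(dc_start_voltage, track_voltage=DCC_TRACK_VOLTAGE):
    """Estimate a CV2 value from the DC voltage at which the loco just starts moving."""
    fraction = dc_start_voltage / track_voltage
    return max(0, min(255, round(fraction * 255)))

# Made-up voltmeter readings for a small fleet; the GP40P-2, needing the
# highest start voltage, is the unit the others get matched toward.
fleet = {
    "GP40P-2 (baseline)": 7.0,
    "r-t-r unit A": 4.5,
    "r-t-r unit B": 5.5,
}

for name, volts in fleet.items():
    print(f"{name}: starts near {volts} V on DC -> try CV2 around {estimated_cv2(volts)}")
```

The point is only that once the voltmeter readings are in hand, turning them into starting values to try on each decoder is simple proportion; the fine tuning still has to happen on the track.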