The sensitivity afforded by quantum sensors is limited by decoherence. Quantum error correction (QEC) can enhance sensitivity by suppressing decoherence, but it has a side effect: in realistic settings, it biases a sensor's output. If unaccounted for, this bias can systematically degrade a sensor's performance in experiments and yield misleading theoretical values for the minimum detectable signal. We analyze this effect in the experimentally motivated setting of continuous-time QEC, showing both how it can be remedied and how incorrect results arise when it is not.