Abstract: In this paper, the effect of the sampling period on several well-established optimal ratio-type fault detection performance indices in sampled-data systems is examined. The analysis provides insight into the intrinsic properties of different optimal fault detection problems and supports the selection of a suitable sampling period. The result is of theoretical interest because the variation of the optimal H-/H∞ index with respect to the sampling period runs contrary to the generally held view: as shown in the paper, if the sampling period is enlarged by an integer factor, the optimal parity space index and the optimal H2/H2 index degrade, while the optimal H-/H∞ index improves.
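For context, a minimal sketch of what such ratio-type indices typically look like, assuming a residual generator R applied to a system with fault transfer matrix G_f and disturbance transfer matrix G_d (the symbols R, G_f, and G_d are illustrative and not taken from the abstract; the paper may use a different formulation):

\[
J_{2/2} \;=\; \sup_{R}\; \frac{\lVert R\,G_f \rVert_2}{\lVert R\,G_d \rVert_2},
\qquad
J_{-/\infty} \;=\; \sup_{R}\; \frac{\lVert R\,G_f \rVert_{-}}{\lVert R\,G_d \rVert_{\infty}},
\]

where \( \lVert \cdot \rVert_{-} \) denotes the H- index (the worst-case fault sensitivity, \( \inf_{\omega} \underline{\sigma}(\cdot) \)) and \( \lVert \cdot \rVert_{\infty} \) is the usual H∞ norm; the parity space index is the analogous ratio built from stacked fault and disturbance matrices over a finite data window. On this reading, the abstract's claim is that replacing the sampling period T by kT for an integer k ≥ 2 lowers the achievable parity space and H2/H2 ratios but raises the achievable H-/H∞ ratio.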