In this paper, we use numerical modeling to analyze in detail the suppression of the semiconductor optical amplifier (SOA) pattern effect by means of a Lyot filter. We formulate a robust design strategy that defines appropriate figures of merit and makes the necessary tradeoffs between them, so that the filter performance can be optimized with respect to the wavelength spacing and detuning of its spectral response. The results of this design procedure agree with experiment and enable us to quantify accurately the degree to which the Lyot filter can resolve the SOA pattern effect problem.
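To make the two design parameters concrete, the sketch below models an ideal two-beam (Lyot-type) filter, whose power transmission is periodic in optical frequency with period equal to the free spectral range; the wavelength spacing follows from this period, and the detuning offsets the transmission peaks from the signal carrier. This is only an illustrative sketch, not the paper's model: the function name, the 100 GHz free spectral range, and the 1550 nm operating band are assumptions introduced here.

```python
import numpy as np

C = 299792458.0  # speed of light, m/s

def lyot_transmission(wavelength_m, fsr_hz, detuning_hz=0.0):
    """Power transmission of an ideal two-beam (Lyot-type) filter.

    fsr_hz      : free spectral range, i.e. the peak-to-peak spacing
                  of the periodic spectral response in frequency
    detuning_hz : offset of the transmission peaks from the signal carrier
    """
    f = C / np.asarray(wavelength_m)
    # Two-beam interference gives a cos^2 transmission, periodic with
    # period fsr_hz in optical frequency.
    return np.cos(np.pi * (f - detuning_hz) / fsr_hz) ** 2

# Sweep wavelengths around 1550 nm for an assumed 100 GHz free spectral
# range; the spectral response shows several full transmission periods.
wl = np.linspace(1549e-9, 1551e-9, 1001)
t = lyot_transmission(wl, fsr_hz=100e9)
```

Converting the free spectral range to a wavelength spacing near a carrier wavelength lambda0 uses the usual relation delta_lambda = lambda0**2 * fsr / c, so a 100 GHz spacing corresponds to roughly 0.8 nm at 1550 nm.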