I remember an interesting Audio Engineering Society presentation (2005) [0] on a similar problem in balanced audio interfaces. Interestingly, an old-school audio transformer is more robust: it maintains high CMRR in the real world even when there's some common-mode impedance imbalance in the system, whereas the CMRR of an op-amp input stage degrades seriously under the same conditions. Designs that naively relied on op-amp CMRR were responsible for many noise problems in balanced audio.
> Where Did We Go Wrong? TRANSFORMERS were essential elements of EVERY balanced interface 50 years ago ... High noise rejection was taken for granted but very few engineers understood why it worked. Differential amplifiers, cheap and simple, began replacing audio transformers by 1970. Equipment specs promised high CMRR, but noise problems in real-world systems became more widespread than ever before ... Reputation of balanced interfaces began to tarnish and “pin 1” problems also started to appear!
> Why Transformers are Better. Typical “active” input stage common-mode impedances are 5 kΩ to 50 kΩ at 60 Hz. Widely used SSM-2141 IC loses 25 dB of CMRR with a source imbalance of only 1 Ω. Typical transformer input common-mode impedances are about 50 MΩ @ 60 Hz. Makes them 1,000 times more tolerant of source imbalances – full CMRR with any real-world source.
> CMRR and Testing. Noise rejection in a real interface depends on how driver, cable, and receiver interact. Traditional CMRR measurements ignore the effects of driver and cable impedances! Like most such tests, the previous IEC version “tweaked” driver impedances to zero imbalance. IEC recognized in 1999 that the results of this test did not correlate to performance in real systems... My realistic method became “IEC Standard 60268-3, Sound System Equipment - Part 3: Amplifiers” in 2000. The latest generation Audio Precision analyzers, APx520/521/525/526, support this CMRR test!
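The “Why Transformers are Better” numbers can be sanity-checked with a back-of-the-envelope model: the source imbalance ΔZs and the receiver's common-mode input impedance Zcm form a voltage divider that converts common-mode noise into differential signal, so the interface's rejection is roughly capped at 20·log10(Zcm/ΔZs). Here's a minimal Python sketch of that idea (the formula is the usual first-order approximation, and the 100 dB intrinsic-CMRR figure is my assumption, not from the slides):

```python
# First-order model of balanced-interface noise rejection vs. source
# impedance imbalance. The Zcm values below are from the quoted slides;
# the cap  CMRR ≈ 20*log10(Zcm / dZs)  (valid for Zcm >> Zs) treats the
# imbalance and the common-mode input impedance as a voltage divider.
import math

def interface_cmrr_db(z_cm_ohms: float, imbalance_ohms: float,
                      device_cmrr_db: float = 100.0) -> float:
    """Whole-interface CMRR: limited by either the receiver's intrinsic
    CMRR (assumed 100 dB here) or the mode conversion from imbalance."""
    if imbalance_ohms == 0:
        return device_cmrr_db
    conversion_limit = 20 * math.log10(z_cm_ohms / imbalance_ohms)
    return min(device_cmrr_db, conversion_limit)

for dz in (0.1, 1.0, 10.0, 100.0):
    opamp = interface_cmrr_db(10e3, dz)   # ~10 kΩ active input stage
    xfmr = interface_cmrr_db(50e6, dz)    # ~50 MΩ transformer input
    print(f"imbalance {dz:6.1f} Ω: opamp ≈ {opamp:5.1f} dB, "
          f"transformer ≈ {xfmr:5.1f} dB")
```

With Zcm around 10 kΩ, a mere 1 Ω imbalance already caps the interface near 80 dB (consistent with the quoted ~25 dB loss from a 100+ dB spec), while the 50 MΩ transformer input keeps its full CMRR across any realistic imbalance.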
[0] https://www.aes-media.org/sections/pnw/pnwrecaps/2005/whitlo...