
The dead-time setting of a servo inverter affects the quality of the output voltage waveform and therefore the EMC spectrum. A dead time that is too short risks bridge-arm shoot-through, producing large short-circuit currents and broadband noise; one that is too long distorts the output voltage, raises the low-frequency harmonic content, and can increase conducted-interference amplitudes in the low-frequency band (e.g., 150 kHz-1 MHz). It is recommended that the dead time be set precisely from the IGBT turn-off time and the drive-circuit delay, typically in the range of 1-3 μs.
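The sizing rule above can be sketched as a simple worst-case timing budget. The function and its figures below are illustrative assumptions, not values from any particular datasheet:

```python
def min_dead_time(t_off_max, t_on_min, t_driver_skew, margin=1.2):
    """Smallest safe dead time, per the rule of thumb above: worst-case
    turn-off delay of the outgoing switch, minus the best-case turn-on
    delay of the incoming switch, plus the gate-driver propagation-delay
    mismatch, scaled by a safety margin. All times in microseconds."""
    return margin * (t_off_max - t_on_min + t_driver_skew)

# Example datasheet-style figures (assumed for illustration only)
td = min_dead_time(t_off_max=1.2, t_on_min=0.1, t_driver_skew=0.3)
print(f"recommended dead time >= {td:.2f} us")
```

With these assumed numbers the result lands at about 1.68 μs, inside the 1-3 μs range cited above; real settings must come from the actual device and driver data.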
Dead-time compensation algorithms can reduce the harmonics introduced by the dead time. From an EMC perspective, the dead time should be minimized while maintaining safety margins, and the switching waveform should be further shaped with snubber circuits and gate resistors. Experimental results show that reducing the dead time from 4 μs to 2 μs can lower the output-current THD by 2% while reducing the conducted-interference peak at the corresponding switching frequency by 3-5 dB. The final dead-time setting must be verified by both double-pulse testing and system-level EMC testing.
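A common form of dead-time compensation is a feed-forward correction: each switching period, the dead time removes (or adds) an average volt-seconds error whose sign follows the phase-current direction, and the voltage command is adjusted to cancel it. The sketch below is one simple variant under assumed parameter names and values, not a specific product's algorithm:

```python
import math

def dead_time_comp(v_ref, i_phase, t_dead, f_sw, v_dc, i_threshold=0.05):
    """Feed-forward dead-time compensation (simple heuristic).

    Adds back the average voltage error caused by the dead time,
    dv = t_dead * f_sw * v_dc, with its sign set by the phase-current
    polarity. Near zero current the polarity is uncertain, so the
    correction is tapered linearly through zero."""
    dv = t_dead * f_sw * v_dc  # average voltage error per switching period
    if abs(i_phase) < i_threshold:
        polarity = i_phase / i_threshold  # linear taper around zero current
    else:
        polarity = math.copysign(1.0, i_phase)
    return v_ref + polarity * dv

# Illustrative numbers (assumed): 2 us dead time, 8 kHz switching, 540 V bus
v_cmd = dead_time_comp(v_ref=100.0, i_phase=3.0,
                       t_dead=2e-6, f_sw=8000.0, v_dc=540.0)
```

For these assumed values the average error is 2e-6 × 8000 × 540 ≈ 8.6 V per phase, which is why uncompensated dead time shows up clearly as low-order harmonics in the output current.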