American Institute of Physics, Applied Physics Letters, 109(19), p. 193503
DOI: 10.1063/1.4966999
The channel temperature (Tch) and thermal resistance (Rth) of Ga2O3 metal-oxide-semiconductor field-effect transistors were investigated through electrical measurements complemented by electrothermal device simulations incorporating experimental Ga2O3 thermal parameters. The analysis technique compared DC and pulsed drain currents (IDS) at known applied biases: because self-heating is negligible under pulsed conditions, Tch can be approximated by the ambient temperature (Tamb), which correlates IDS with Tch. The device model was validated by calibration against the DC data. The experimental Tch agreed well with simulations for Tamb between 20 °C and 175 °C. The large Rth of 48 mm·K/W extracted at room temperature highlights the value of thermal analysis for understanding the degradation mechanisms and improving the reliability of Ga2O3 power devices.
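To illustrate the extraction method described above, the following is a minimal Python sketch of the pulsed-vs-DC IDS comparison. It assumes a pulsed calibration curve IDS(Tamb) measured at a fixed bias (where Tch ≈ Tamb), inverts it to map a measured DC IDS onto Tch, and computes Rth = (Tch − Tamb)/Pdiss. All numerical values and the function name channel_temperature are illustrative placeholders, not data or code from the paper.

import numpy as np

# Hypothetical pulsed-IV calibration at a fixed (VGS, VDS) bias point.
# Under pulsed conditions self-heating is negligible, so Tch ~= Tamb.
# Values below are illustrative, not taken from the paper.
tamb_cal_c = np.array([20.0, 50.0, 100.0, 150.0, 175.0])    # ambient T, deg C
ids_pulsed = np.array([60.0, 54.0, 45.0, 38.0, 35.0])       # pulsed IDS, mA/mm

def channel_temperature(ids_dc):
    """Map a DC drain current (mA/mm) to Tch (deg C) via the pulsed curve.

    Pulsed IDS at ambient Tamb equals DC IDS at channel temperature Tch,
    so inverting IDS(T) yields Tch for any measured DC current.
    """
    # IDS decreases monotonically with temperature, so reverse the arrays
    # to give np.interp the increasing abscissa it requires.
    return np.interp(ids_dc, ids_pulsed[::-1], tamb_cal_c[::-1])

# Example DC operating point at room temperature (illustrative numbers).
tamb = 20.0                    # ambient temperature, deg C
vds, ids_dc = 10.0, 50.0       # drain bias (V) and DC IDS (mA/mm)

tch = channel_temperature(ids_dc)

# Width-normalized thermal resistance: Rth = (Tch - Tamb) / Pdiss,
# with Pdiss = VDS * IDS expressed in W/mm.
p_diss = vds * ids_dc * 1e-3   # W/mm
rth = (tch - tamb) / p_diss    # mm*K/W
print(f"Tch = {tch:.1f} deg C, Rth = {rth:.1f} mm*K/W")

With real pulsed and DC data at the same bias, the same interpolation step would reproduce the paper's Tch extraction across the 20 °C to 175 °C ambient range; the linear interpolation here is a simplification, and a fitted IDS(T) model could be substituted where the calibration points are sparse.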