Published in

Applied Physics Letters (American Institute of Physics), 109(19), p. 193503

DOI: 10.1063/1.4966999

Characterization of channel temperature in Ga2O3 metal-oxide-semiconductor field-effect transistors by electrical measurements and thermal modeling

This paper was not found in any repository, but could be made available legally by the author.

Full text: Unavailable

Preprint: archiving allowed
Postprint: archiving allowed
Published version: archiving restricted
Data provided by SHERPA/RoMEO

Abstract

The channel temperature (Tch) and thermal resistance (Rth) of Ga2O3 metal-oxide-semiconductor field-effect transistors were investigated through electrical measurements complemented by electrothermal device simulations that incorporated experimental Ga2O3 thermal parameters. The analysis technique was based on a comparison between DC and pulsed drain currents (IDS) at known applied biases: negligible self-heating under pulsed conditions allows Tch to be approximated by the ambient temperature (Tamb), which in turn correlates IDS with Tch. The device model was validated by calibration against the DC data. The experimental Tch was in good agreement with simulations for Tamb between 20 °C and 175 °C. The large Rth of 48 mm·K/W extracted at room temperature highlights the value of thermal analysis for understanding the degradation mechanisms and improving the reliability of Ga2O3 power devices.
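
The DC-vs-pulsed extraction described in the abstract can be sketched in a few lines: pulsed I-V at varied ambient temperature gives an IDS(Tamb) calibration curve (Tch ≈ Tamb under pulsed conditions), the DC IDS at the same bias is mapped onto that curve to estimate Tch, and the width-normalized Rth follows as the temperature rise divided by the dissipated power per unit gate width. This is a minimal illustrative sketch; the calibration currents, bias point, and power density below are hypothetical, not data from the paper.

```python
# Sketch of the DC-vs-pulsed channel-temperature extraction.
# All numeric values are hypothetical illustrations, not measured data.

def channel_temperature(ids_dc, temps, ids_pulsed):
    """Estimate Tch by mapping a DC drain current onto the pulsed
    IDS(Tamb) calibration curve via linear interpolation.
    `temps` must be ascending and `ids_pulsed` monotonically decreasing
    (IDS drops as the channel heats)."""
    pairs = list(zip(temps, ids_pulsed))
    for (t0, i0), (t1, i1) in zip(pairs, pairs[1:]):
        if i1 <= ids_dc <= i0:
            frac = (i0 - ids_dc) / (i0 - i1)   # fractional position in segment
            return t0 + frac * (t1 - t0)
    raise ValueError("DC current lies outside the calibration range")

def thermal_resistance(t_ch, t_amb, power_per_mm):
    """Width-normalized Rth in mm*K/W: channel temperature rise divided
    by dissipated power per unit gate width (W/mm)."""
    return (t_ch - t_amb) / power_per_mm

# Hypothetical pulsed calibration at one fixed bias point:
temps = [20.0, 75.0, 125.0, 175.0]    # ambient temperature, deg C
ids_pulsed = [10.0, 9.0, 8.2, 7.5]    # pulsed IDS, mA

t_ch = channel_temperature(8.6, temps, ids_pulsed)   # DC IDS of 8.6 mA
rth = thermal_resistance(t_ch, 20.0, 5.0 / 3.0)      # ~1.67 W/mm dissipated
```

With these made-up numbers the DC current of 8.6 mA lands midway between the 75 °C and 125 °C calibration points, giving Tch ≈ 100 °C and an Rth of roughly 48 mm·K/W, consistent in form (though not in provenance) with the room-temperature value reported in the abstract.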