Published in

Wiley Open Access, Advanced Science, 15(11), 2024

DOI: 10.1002/advs.202303403

Closely Packed Stretchable Ultrasound Array Fabricated with Surface Charge Engineering for Contactless Gesture and Materials Detection

This paper is made freely available by the publisher.

Preprint: archiving allowed
Postprint: archiving allowed
Published version: archiving allowed
Data provided by SHERPA/RoMEO

Abstract

Communication with hand gestures plays a significant role in human‐computer interaction by providing an intuitive and natural way for humans to communicate with machines. Ultrasound‐based devices have shown promising results in contactless hand gesture recognition without requiring physical contact. However, it is challenging to fabricate a densely packed wearable ultrasound array. Here, a stretchable ultrasound array is demonstrated with closely packed transducer elements fabricated using surface charge engineering between pre‐charged 1–3 Lead Zirconate Titanate (PZT) composite and thin polyimide film without using a microscope. The array exhibits excellent ultrasound properties with a wide bandwidth (≈57.1%) and high electromechanical coefficient (≈0.75). The ultrasound array can decipher gestures at distances of up to 10 cm by using a contactless triboelectric module and can identify materials from the time constant of the exponentially decaying impedance based on their triboelectric properties by utilizing the electrostatic induction phase. The newly proposed metric of the areal‐time constant is material‐specific and decreases monotonically from a highly positive human body (1.13 m2 s) to negatively charged polydimethylsiloxane (PDMS) (0.02 m2 s) in the triboelectric series. The capability of the closely packed ultrasound array to detect materials along with hand gesture interpretation provides an additional dimension in next‐generation human‐robot interaction.
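The material-identification step described in the abstract extracts a time constant from an exponentially decaying impedance signal. A minimal sketch of how such a time constant could be estimated with a log-linear least-squares fit (the function name, the 0.05 m2 sensing area, and the synthetic data are illustrative assumptions, not values from the paper):

```python
import numpy as np

def decay_time_constant(t, z):
    """Estimate tau for z(t) ~ z0 * exp(-t / tau) via a log-linear least-squares fit."""
    slope, _intercept = np.polyfit(t, np.log(z), 1)
    return -1.0 / slope

# Synthetic decaying impedance trace with a known tau of 0.5 s
t = np.linspace(0.0, 2.0, 200)
z = 3.0 * np.exp(-t / 0.5)

tau = decay_time_constant(t, z)
# Hypothetical areal-time constant: decay constant scaled by a 0.05 m^2 sensing area
areal_tau = tau * 0.05
print(tau, areal_tau)
```

In practice the measured impedance would be noisy, so a nonlinear fit (e.g. `scipy.optimize.curve_fit`) may be more robust than the log-linear approach sketched here.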