Posted on 2022-09-27, 18:04. Authored by Zhenxuan Zhao, Jianshi Tang, Jian Yuan, Yijun Li, Yuan Dai, Jian Yao, Qingtian Zhang, Sanchuan Ding, Tingyu Li, Ruirui Zhang, Yu Zheng, Zhengyou Zhang, Song Qiu, Qingwen Li, Bin Gao, Ning Deng, He Qian, Fei Xing, Zheng You, Huaqiang Wu
In the long pursuit of smart robotics, it has long been envisioned to empower robots with human-like senses, especially vision and touch. While tremendous progress has been made in image sensors and computer vision over the past decades, tactile sensing lags behind due to the lack of large-scale flexible tactile sensor arrays with high sensitivity, high spatial resolution, and fast response.
In this work, we have demonstrated a 64 × 64 flexible tactile sensor array with a record-high spatial resolution of 0.9 mm (equivalent to 28.2 pixels per inch) by integrating a high-performance piezoresistive film (PRF) with a large-area active matrix of carbon nanotube thin-film transistors.
The PRF with self-formed microstructures exhibited a high pressure sensitivity of ∼385 kPa⁻¹ at a multi-walled carbon nanotube (MWCNT) concentration of 6%, while the 14% formulation exhibited a fast response time of ∼3 ms, good linearity, a broad detection range beyond 1400 kPa, and excellent cyclability over 3000 cycles.
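For context, the abstract does not spell out how sensitivity is defined; a common convention for piezoresistive sensors, assumed here, is the slope of the relative output change versus applied pressure, S = (ΔI/I0)/ΔP. A minimal sketch with synthetic numbers chosen only to land near the quoted order of magnitude:

```python
import numpy as np

# Assumed definition: S is the slope of the relative current change (ΔI / I0)
# versus applied pressure P. The pressure/current values below are synthetic
# placeholders for illustration, not measured data from the paper.
pressure_kpa = np.array([0.0, 0.5, 1.0, 1.5, 2.0])           # applied pressure (kPa)
current_ua = np.array([1.0, 193.5, 386.0, 578.5, 771.0])     # sensor current (µA)

i0 = current_ua[0]                        # baseline current at zero pressure
rel_change = (current_ua - i0) / i0       # ΔI / I0 (dimensionless)
sensitivity, _ = np.polyfit(pressure_kpa, rel_change, 1)     # slope in kPa^-1
print(f"S ≈ {sensitivity:.0f} kPa^-1")    # -> S ≈ 385 kPa^-1
```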
Using this fully integrated tactile sensor array, the footprint maps of an artificial honeybee were clearly identified. Furthermore, we implemented a smart tactile system by integrating the PRF-based sensor array with a memristor-based computing-in-memory chip to record and recognize handwritten digits and Chinese calligraphy, achieving high classification accuracies of 98.8% and 97.3% in hardware, respectively.
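The abstract does not describe the network mapped onto the computing-in-memory chip; purely as an illustration of near-sensor classification of 64 × 64 tactile frames, the following is a minimal software sketch, assuming PyTorch and a small convolutional classifier for the ten digit classes:

```python
import torch
import torch.nn as nn

class TactileDigitNet(nn.Module):
    """Illustrative classifier for 64x64 single-channel tactile frames.

    This is NOT the network from the paper; it is a hypothetical sketch of
    how near-sensor classification of pressure maps could be set up.
    """
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1),   # 64x64 -> 64x64
            nn.ReLU(),
            nn.MaxPool2d(2),                             # -> 32x32
            nn.Conv2d(8, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),                             # -> 16x16
        )
        self.classifier = nn.Linear(16 * 16 * 16, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(start_dim=1))

# One random pressure frame (batch=1, channel=1, 64x64) just to check shapes.
frame = torch.rand(1, 1, 64, 64)
logits = TactileDigitNet()(frame)
print(logits.shape)  # torch.Size([1, 10])
```

In a computing-in-memory deployment, the weight matrices of such a network would be programmed as memristor conductances so that the matrix-vector multiplications are performed in the analog domain near the sensor, which is what motivates the power and latency claims below.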
The integration of sensor networks with deep learning hardware may enable edge or near-sensor computing with significantly reduced power consumption and latency. Our work could enable the construction of large-scale intelligent sensor networks for next-generation smart robotics.