
Exploring the Impact of Domain Numbers on Negative Capacitance Effects in Ferroelectric Device-Circuit Co-Design

Solid-State Electronics (2023)

Indian Institute of Technology Roorkee

Abstract
This paper explores the performance of the Hf0.5Zr0.5O2 (HZO)-stacked negative capacitance field-effect transistor (NCFET) as a candidate for beyond-CMOS (complementary metal-oxide-semiconductor) technology. For the first time, we investigate the impact of the number of ferroelectric (FE) domains on the negative capacitance (NC) effect, energy dissipation, NC-effect voltage window, polarization ramping rate, voltage amplification (A_NC), and oscillation frequency of a 5-stage HZO-NCFET inverter-based ring oscillator (HZO-NCFET-RO). The results show that the HZO-NCFET is suitable for low-voltage and high-speed applications, providing a significant increase in A_NC compared to the conventional PZT-NCFET and P(VDF-TrFE)-NCFET. Our study suggests that the intrinsic NC-effect time scale of HZO is very short and limited by the FE switching. We show that the switching time of HZO is about 20 times faster than that of the traditional FE [Pb(Nb0.04Zr0.28Ti0.68)O3]. Finally, the proposed 5-stage HZO-NCFET-RO offers superior performance, with a 26% higher oscillation frequency and a 94.42% reduction in power dissipation compared to a standard 5-stage CMOS inverter-based RO. These findings highlight the potential of the HZO-stacked NCFET as an alternative device for future beyond-CMOS technology.
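For context, the voltage amplification A_NC quoted in the abstract is usually understood through the standard series-capacitance picture of an NC gate stack. The relation below is a sketch of that textbook model, not a formula taken from this paper, and the symbols (C_FE, C_MOS, V_int, V_G) are illustrative:

```latex
% Internal-node voltage gain of an NCFET gate stack
% (standard capacitance-divider model; not the paper's own derivation)
A_{NC} \;=\; \frac{\partial V_{\mathrm{int}}}{\partial V_{G}}
       \;=\; \frac{|C_{FE}|}{\,|C_{FE}| - C_{MOS}\,}
       \;>\; 1
\qquad \text{when } |C_{FE}| > C_{MOS},
```

where C_FE is the (negative) ferroelectric capacitance in series with the underlying MOS capacitance C_MOS. As the denominator approaches zero the internal gate voltage is amplified, which is the mechanism behind the low-voltage operation discussed above.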
Key words
Domains, Ferroelectric, Ferroelectric Switching, Negative Capacitance Effects, Negative Capacitance FET, PZT, P(VDF-TrFE), Ring Oscillator