
Table Pre-training: A Survey on Model Architectures, Pretraining Objectives, and Downstream Tasks

IJCAI 2022

Cited by 18
Abstract
Following the success of pre-training techniques in the natural language domain, a flurry of table pre-training frameworks have been proposed and have achieved new state-of-the-art results on various downstream tasks such as table question answering, table type recognition, column relation classification, table search, and formula prediction. Various model architectures have been explored to best capture the characteristics of (semi-)structured tables, in particular specially designed attention mechanisms. Moreover, to fully leverage the supervision signals in unlabeled tables, diverse pre-training objectives have been designed and evaluated, for example, denoising cell values, predicting numerical relationships, and learning a neural SQL executor. This survey aims to provide a comprehensive review of model designs, pre-training objectives, and downstream tasks for table pre-training, and we further share our thoughts on existing challenges and future opportunities.
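As an illustration of the first objective the abstract mentions, the sketch below shows one way cell-denoising targets could be constructed from an unlabeled table: random cells are masked and the original values become reconstruction targets. This is a minimal, hypothetical sketch; the function name `mask_cells`, the mask token, and the masking rate are illustrative assumptions, not the procedure of any specific surveyed framework.

```python
# Illustrative sketch of a cell-denoising pre-training setup (hypothetical,
# not taken from any surveyed framework).
import random

MASK_TOKEN = "[MASK]"

def mask_cells(table, mask_prob=0.15, seed=0):
    """Randomly replace cell values with MASK_TOKEN.

    Returns the corrupted table plus (row, col, original_value) tuples
    that a model would be trained to reconstruct.
    """
    rng = random.Random(seed)
    corrupted = [row[:] for row in table]  # copy so the input stays intact
    targets = []
    for i, row in enumerate(table):
        for j, value in enumerate(row):
            if rng.random() < mask_prob:
                corrupted[i][j] = MASK_TOKEN
                targets.append((i, j, value))
    return corrupted, targets

table = [
    ["Country", "Capital", "Population (M)"],
    ["France", "Paris", "67.8"],
    ["Japan", "Tokyo", "125.7"],
]
corrupted, targets = mask_cells(table, mask_prob=0.3)
print(corrupted)  # table with some cells replaced by [MASK]
print(targets)    # the cells the model must recover during pre-training
```

In practice a table pre-training model would consume a linearized or structure-aware encoding of `corrupted` and be scored on recovering `targets`; the other objectives named above (numerical relationship prediction, neural SQL execution) follow the same corrupt-and-reconstruct or predict-from-structure pattern with different supervision signals.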
Keywords
Topic Modeling, Pretrained Models, Predictive