Diagnosing Bias in the Gender Representation of HCI Research Participants: How It Happens and Where We Are.
ACM Conference on Human Factors in Computing Systems (CHI), 2021 · CCF A
University of British Columbia | Korea Advanced Institute of Science & Technology
- Pretraining has recently greatly advanced the development of natural language processing (NLP).
- We show that M6 outperforms the baselines in multimodal downstream tasks, and that the large M6 with 10 billion parameters achieves even better performance.
- We propose a method called M6 that can process information of multiple modalities and perform both single-modal and cross-modal understanding and generation.
- The model is scaled up to 10 billion parameters with sophisticated deployment, and the 10-billion-parameter M6-large is the largest pretrained model in Chinese.
- Experimental results show that the proposed M6 outperforms the baselines in a number of downstream tasks involving both single and multiple modalities. We will continue pretraining extremely large models on increasing amounts of data to explore the limits of their performance.

Cited by: 0
Collecting and Reporting Race and Ethnicity Data in HCI
Cited by: 24
Examining Identity as a Variable of Health Technology Research for Older Adults: A Systematic Review
Cited by: 13
Who Am I, and Who Are You, and Who Are We? A Scientometric Analysis of Gender and Geography in HCI
Cited by: 5
RMS: Removing Barriers to Analyze the Availability and Surge Pricing of Ridesharing Services
Cited by: 1
Barriers to Online Dementia Information and Mitigation.
Cited by: 13
Are We There Yet? Feminist Approaches in Information Science
Cited by: 0
Literature Reviews in HCI: A Review of Reviews.
Cited by: 8
Not Only WEIRD but “uncanny”? A Systematic Review of Diversity in Human–Robot Interaction Research
Cited by: 1
15 Years of (who)man Robot Interaction: Reviewing the H in Human-Robot Interaction
Cited by: 16
Eleven Years of Gender Data Visualization: A Step Towards More Inclusive Gender Representation
Cited by: 0
Stay Vigilant: the Threat of a Replication Crisis in VR Locomotion Research
Cited by: 2