No Support for Relatedness and Kin Selection to Explain High Rates of Conspecific Brood Parasitism in Colonial Red-breasted Mergansers (Mergus serrator)
Canadian Journal of Zoology (2021)
Univ St Anne | McGill Univ
Abstract
Conspecific brood parasitism (CBP) has been observed in approximately half of all species of waterfowl, a philopatric group in which breeding females are frequently locally related. It has been suggested that kin selection can facilitate the evolution of CBP in waterfowl via fitness benefits for the host and parasite. One model demonstrates that discrimination of related and unrelated parasites by the host must be sufficient for kinship to promote CBP, provided that costs of brood parasitism to host fitness are sufficiently low. We parameterized the model using demographic data and behavioural observations from a population of colonial Red-breasted Mergansers (Mergus serrator Linnaeus, 1758) in which 47% of nests were parasitized by conspecifics. The costs of 1–3 foreign eggs to host hatching success were generally small (decline of 1.8% per additional egg). Nevertheless, model outputs revealed that brood parasites maximize their inclusive fitness by avoiding nests of relatives, primarily because of constraints on a host’s ability to detect parasites at the nest. Indeed, hosts spent <8% of the diurnal period at the nest during egg laying, a period when parasite activity is greatest. It is thus highly unlikely that relatedness and kin selection promote brood parasitism in this population.
Key words
alternative reproductive strategies, conspecific brood parasitism, kin selection, relatedness, fitness, hatching success, waterfowl, Red-breasted Merganser, Mergus serrator