The CMS Computing, Software and Analysis Challenge

17th International Conference on Computing in High Energy and Nuclear Physics (CHEP09), 2010

Abstract
In May 2008 the CMS experiment performed a comprehensive challenge to test the full scope of the offline data handling and analysis activities needed for data taking during the first few weeks of LHC collider operations. It was the first full-scale challenge with large statistics under the conditions expected at LHC start-up, including the anticipated initial mis-alignments and mis-calibrations of each sub-detector, and event signatures and rates typical of low instantaneous luminosity. Particular emphasis was placed on the prompt reconstruction workflows and on the procedures for the alignment and calibration of each sub-detector. The latter were performed with restricted latency using the same computing infrastructure that will be used for real data, and the resulting calibration and alignment constants were used to re-reconstruct the data at Tier-1 centres. The paper presents the goals and practical experience of the challenge, as well as the lessons learned in view of LHC data taking.
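The abstract describes a three-step processing chain: prompt reconstruction with start-up conditions, alignment and calibration derived with restricted latency from the prompt output, and a re-reconstruction pass at Tier-1 centres using the updated constants. The minimal Python sketch below only illustrates that sequencing; it is not CMS software, and every name in it is a hypothetical placeholder.

"""Illustrative sketch (not CMS software) of the sequencing described in the
abstract: prompt reconstruction, constrained-latency alignment/calibration,
then re-reconstruction with the updated constants. All names are placeholders."""

from dataclasses import dataclass, field


@dataclass
class ConditionsDB:
    # Stand-in for a conditions database holding alignment/calibration constants.
    constants: dict = field(
        default_factory=lambda: {"alignment": "startup", "calibration": "startup"}
    )

    def update(self, new_constants: dict) -> None:
        self.constants.update(new_constants)


def prompt_reconstruction(raw_events, conditions):
    # First-pass reconstruction using the start-up (mis-aligned) constants.
    return [{"event": e, "conditions": dict(conditions.constants)} for e in raw_events]


def derive_constants(reco_events):
    # Alignment/calibration workflows run with restricted latency on the prompt output.
    return {"alignment": "week-1", "calibration": "week-1"}


def re_reconstruction(raw_events, conditions):
    # Second-pass reconstruction, representing the Tier-1 re-reconstruction step.
    return [{"event": e, "conditions": dict(conditions.constants)} for e in raw_events]


if __name__ == "__main__":
    db = ConditionsDB()
    raw = list(range(5))                      # placeholder for RAW events
    prompt = prompt_reconstruction(raw, db)   # prompt reco with start-up conditions
    db.update(derive_constants(prompt))       # calibration loop updates the conditions
    rereco = re_reconstruction(raw, db)       # re-reco picks up the new constants
    print(rereco[0]["conditions"])            # {'alignment': 'week-1', 'calibration': 'week-1'}

Running the sketch prints the "week-1" constants attached to the re-reconstructed events, mirroring how the second pass uses the conditions derived from the first.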