Minimizing Cache Misses in an Event-driven Network Server: A Case Study of TUX

Tampa, FL (2006)

Abstract
We analyze the performance of CPU-bound network servers and demonstrate experimentally that the degradation in the performance of these servers under high-concurrency workloads is largely due to inefficient use of the hardware caches. We then describe an approach to speeding up event-driven network servers by optimizing their use of the L2 CPU cache in the context of the TUX web server, known for its robustness to heavy load. Our approach is based on a novel cache-aware memory allocator and a specific scheduling strategy that together ensure that the total working data set of the server stays in the L2 cache. Experiments show that under high concurrency, our optimizations improve the throughput of TUX by up to 40% and the number of requests serviced at the time of failure by 21%.
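The core idea in the abstract, bounding the server's total working set to the L2 cache through a cache-aware allocator and a complementary admission/scheduling policy, can be illustrated with a minimal sketch. The C code below is a hedged toy, not the paper's allocator or TUX code: the L2 size, the per-connection state size, and the defer-on-full policy are all assumptions chosen for illustration.

```c
/*
 * Illustrative sketch only: a fixed-budget connection arena that conveys the
 * idea of capping the server's working set at the L2 cache size. The budget,
 * struct layout, and defer policy are hypothetical, not TUX internals.
 */
#include <stdio.h>
#include <string.h>

#define L2_CACHE_BYTES   (512 * 1024)   /* assumed L2 size for the sketch */
#define CONN_STATE_BYTES 4096           /* assumed per-connection state   */
#define MAX_ACTIVE       (L2_CACHE_BYTES / CONN_STATE_BYTES)

struct conn {                           /* hypothetical connection state  */
    int  fd;
    char buf[CONN_STATE_BYTES - sizeof(int)];
};

static struct conn  pool[MAX_ACTIVE];   /* arena sized to the L2 budget   */
static struct conn *free_list[MAX_ACTIVE];
static int          free_top = 0;

static void pool_init(void)
{
    for (int i = 0; i < MAX_ACTIVE; i++)
        free_list[free_top++] = &pool[i];
}

/* Returns NULL when the L2 budget is exhausted; the caller defers the new
 * connection instead of letting the working set outgrow the cache. */
static struct conn *conn_alloc(int fd)
{
    if (free_top == 0)
        return NULL;
    struct conn *c = free_list[--free_top];
    memset(c, 0, sizeof(*c));
    c->fd = fd;
    return c;
}

static void conn_free(struct conn *c)
{
    free_list[free_top++] = c;
}

int main(void)
{
    pool_init();
    /* Toy admission loop: allocate connection state until the L2-sized
     * arena is full, defer the rest, then service and release the batch. */
    struct conn *active[MAX_ACTIVE];
    int n = 0, deferred = 0;
    for (int fd = 0; fd < 200; fd++) {
        struct conn *c = conn_alloc(fd);
        if (!c) { deferred++; continue; }
        active[n++] = c;
    }
    for (int i = 0; i < n; i++)
        conn_free(active[i]);
    printf("admitted=%d deferred=%d budget=%d\n", n, deferred, (int)MAX_ACTIVE);
    return 0;
}
```

The sketch shows only the budgeting principle: state for active connections comes from a pool whose total size matches the cache, and work beyond that budget is deferred rather than allowed to evict cached data.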
Keywords
Internet, cache storage, file servers, storage allocation, CPU-bound network servers, L2 CPU cache, TUX Web server, cache-aware memory allocation, event-driven network server, scheduling strategy