HDM: A Composable Framework for Big Data Processing.

IEEE Transactions on Big Data (2018)

Cited by: 16
Abstract
Over the past few years, frameworks such as MapReduce and Spark have been introduced to ease the task of developing big data programs and applications. However, jobs in these frameworks are coarsely defined and packaged as executable jars, without any of their functionality being exposed or described. This means that deployed jobs are not natively composable or reusable for subsequent development. Besides, ...
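The composability gap described above can be made concrete with a minimal sketch. This is not HDM's actual API (the names below are hypothetical); it only illustrates the idea the abstract contrasts with packaged jars: a pipeline as a first-class value whose stages stay exposed, inspectable, and reusable in later compositions.

```python
# Hypothetical sketch (not HDM's API): a data-flow pipeline as a
# first-class, composable value rather than an opaque executable jar.
from functools import reduce

def pipeline(*stages):
    """Compose stages left-to-right into one reusable function."""
    return lambda data: reduce(lambda acc, stage: stage(acc), stages, data)

# Individual stages are plain functions, so each remains reusable on its own.
tokenize   = lambda lines: [w for line in lines for w in line.split()]
lowercase  = lambda words: [w.lower() for w in words]
word_count = lambda words: {w: words.count(w) for w in set(words)}

# Pipelines compose with other pipelines, unlike sealed packaged jobs.
normalize = pipeline(tokenize, lowercase)
wc        = pipeline(normalize, word_count)

print(wc(["Big Data", "big data processing"]))
# → {'big': 2, 'data': 2, 'processing': 1}
```

Because `normalize` is itself a value, a later job can reuse it directly instead of re-implementing or re-deploying the whole original job.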
Keywords
Big Data, Optimization, Distributed databases, Sparks, Pipeline processing, Data analysis, Runtime