Log-Sigmoid Activation-Based Long Short-Term Memory for Time Series Data Classification

IEEE Transactions on Artificial Intelligence (2024)

Abstract
With the growing use of Artificial Intelligence (AI)-driven applications, researchers often face the challenge of improving the accuracy of data classification models while trading off complexity. In this paper, we address the classification of time series data using the Long Short-Term Memory (LSTM) network, with a focus on its activation functions. While existing activation functions such as sigmoid and $\tanh$ are used as the LSTM's internal activations, the customizability of these activations remains limited. This motivates us to propose a new family of activation functions, called $\log$-sigmoid, inside the LSTM cell for time series data classification, and to analyze its properties. We also present the use of a linear transformation (e.g., $\log\tanh$) of the proposed $\log$-sigmoid activation as a replacement for the traditional $\tanh$ function in the LSTM cell. Both the cell activation and the recurrent activation functions inside the LSTM cell are modified with the $\log$-sigmoid activation family while tuning the $\log$ bases. Further, we report a comparative performance analysis of the LSTM model using the proposed and state-of-the-art activation functions on multiple public time-series databases.
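The abstract describes replacing the LSTM cell's internal activations (the sigmoid recurrent activations on the gates and the $\tanh$ cell activation) with members of a tunable $\log$-sigmoid family. The exact functional form of the paper's $\log$-sigmoid is not given in this abstract, so the sketch below uses a hypothetical placeholder, `log_sigmoid_family`, chosen only to illustrate a sigmoid-like function bounded in $(0,1)$ whose shape is controlled by a log base. The `lstm_step` function shows the general mechanism of swapping pluggable activations into a single LSTM cell step:

```python
import numpy as np

def sigmoid(x):
    """Standard logistic sigmoid, bounded in (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

def log_sigmoid_family(x, base=np.e):
    """Hypothetical illustration of a log-base-tunable sigmoid-like
    activation; NOT the paper's exact definition. Output stays in (0, 1)
    because sigmoid(x) is in (0, 1), so the log argument is in (1, base)."""
    return np.log1p((base - 1.0) * sigmoid(x)) / np.log(base)

def lstm_step(x, h, c, W, U, b,
              recurrent_act=sigmoid, cell_act=np.tanh):
    """One LSTM cell step with pluggable internal activations.

    x: input (D,);  h, c: previous hidden/cell state (H,)
    W: (4H, D), U: (4H, H), b: (4H,) stacked gate parameters
    recurrent_act: activation for input/forget/output gates
    cell_act: activation for the candidate cell state and output
    """
    H = h.shape[0]
    z = W @ x + U @ h + b            # pre-activations for all four gates
    i = recurrent_act(z[:H])         # input gate
    f = recurrent_act(z[H:2*H])      # forget gate
    o = recurrent_act(z[2*H:3*H])    # output gate
    g = cell_act(z[3*H:])            # candidate cell state
    c_new = f * c + i * g
    h_new = o * cell_act(c_new)
    return h_new, c_new
```

Swapping `recurrent_act=lambda z: log_sigmoid_family(z, base=10.0)` (or a corresponding transformed function in place of `cell_act`) changes the gate response curves while leaving the LSTM recurrence itself untouched, which is the kind of modification the abstract describes.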
Keywords
Activation, classification, LSTM, sigmoid