LSTM

Introduction

LSTM stands for Long Short-Term Memory; it is a variant of the recurrent neural network (RNN). The strength of the LSTM architecture is its built-in memory. Imagine a daily conversation: we can predict the next words from the context of the conversation because we have remembered the important parts of it. See the example below,

[figure: the conversation, before forgetting]

After several days you can probably remember only a few key pieces of information, having chosen to forget the irrelevant rest. Here is what you might still remember after several days:

[figure: the key information remembered after several days]

Because an LSTM can remember across time steps, it has a wide range of applications, such as text translation, speech-to-text transcription, and audio/video prediction.
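To make the "built-in memory" concrete, here is a minimal sketch of a single LSTM step in NumPy. It is not a trainable model, just the standard cell update: a forget gate decides what to erase from the memory (cell state), an input gate decides what new information to store, and an output gate decides what to expose. All names (`lstm_step`, the weight layout `W`, `b`) are my own illustrative choices, not from any particular library.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def lstm_step(x, h_prev, c_prev, W, b):
    """One LSTM time step.

    x: input vector (D,), h_prev/c_prev: previous hidden and cell state (H,).
    W: stacked gate weights (4*H, D+H), b: stacked biases (4*H,).
    """
    H = h_prev.shape[0]
    z = W @ np.concatenate([x, h_prev]) + b
    f = sigmoid(z[0:H])        # forget gate: how much old memory to keep
    i = sigmoid(z[H:2*H])      # input gate: how much new info to write
    g = np.tanh(z[2*H:3*H])    # candidate values to write into memory
    o = sigmoid(z[3*H:4*H])    # output gate: how much memory to expose
    c = f * c_prev + i * g     # new cell state -- the long-term memory
    h = o * np.tanh(c)         # new hidden state -- the short-term output
    return h, c

# Toy usage: feed a short random sequence through one cell.
rng = np.random.default_rng(0)
D, H = 3, 4                                  # input and hidden sizes
W = rng.normal(scale=0.1, size=(4 * H, D + H))
b = np.zeros(4 * H)
h, c = np.zeros(H), np.zeros(H)
for x in rng.normal(size=(5, D)):
    h, c = lstm_step(x, h, c, W, b)
print(h.shape)  # (4,)
```

Note how the cell state `c` is carried forward additively (`f * c_prev + i * g`), which is what lets information survive many time steps, mirroring the conversation example above.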

Limitations

References

Terry Pan
Student of Data Science

My research interests include Machine Learning, Data Science, Information Security and Software Engineering. I like to think like an engineer to tackle real-world problems.
