Artificial neural networks (ANNs) perform a variety of tasks in machine learning, from prediction to adaptive control to natural language processing. While there are many types of ANNs, no single type excels at every task; the best performance comes from matching the network type to the task at hand.
One type of ANN particularly well suited to data with a time-series component is the Long Short-Term Memory (LSTM) network. LSTMs have explicitly defined memory units that store a "hidden" memory state. The network learns to modify this memory using a "forget" gate and an "input" gate: the forget gate erases information from the memory state, and the input gate stores or adds new information. During training, the network learns what information to store and when to erase it. In addition, an "output" gate learns when to retrieve information from the memory state.
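The gating mechanism described above can be sketched in a few lines of code. The snippet below is a minimal, illustrative single-unit LSTM step in pure Python (not the production model from this work); the weight layout and toy values are assumptions chosen for clarity, and the standard sigmoid/tanh gate equations are used.

```python
import math

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def lstm_step(x, h, c, w):
    """One LSTM step for a single (scalar) unit.

    w maps each gate name to a tuple (input_weight, hidden_weight, bias);
    these are illustrative, not trained values.
    """
    f = sigmoid(w["f"][0] * x + w["f"][1] * h + w["f"][2])    # forget gate: how much of c to keep
    i = sigmoid(w["i"][0] * x + w["i"][1] * h + w["i"][2])    # input gate: how much new info to write
    g = math.tanh(w["g"][0] * x + w["g"][1] * h + w["g"][2])  # candidate memory content
    c_new = f * c + i * g                                     # erase via f, store via i
    o = sigmoid(w["o"][0] * x + w["o"][1] * h + w["o"][2])    # output gate: when to read the memory out
    h_new = o * math.tanh(c_new)
    return h_new, c_new

# Toy weights and a short "time series" of inputs (purely illustrative).
w = {k: (0.5, 0.5, 0.0) for k in ("f", "i", "g", "o")}
h, c = 0.0, 0.0
for x in [1.0, 0.5, -0.5]:
    h, c = lstm_step(x, h, c, w)
```

In a real network each gate is a full weight matrix applied to vector-valued inputs and hidden states, and the weights are learned by backpropagation through time; the scalar version above only shows how the three gates interact with the memory state at one time step.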
An LSTM network was one of the models used for our wildfire occurrence prediction model. The memory state allowed the model to learn concepts that depend on a location's history, such as extended periods of high temperatures or no precipitation, which can lead to a higher risk of fire starts.