Bidirectional LSTM
The basic idea of bidirectional recurrent neural networks (BRNNs) is to present each training sequence forwards and backwards to two separate recurrent nets, both of which are connected to the same output layer. This means that for every point in a given sequence, the BRNN has complete, sequential information about all points before and after it.
Because the net is free to use as much or as little of this context as necessary, there is no need to find an appropriate time-window or target delay size; the network can also access long-range context in both input directions.
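As a concrete illustration, the following is a minimal PyTorch sketch of this architecture (the framework, the class name BiLSTMTagger, and all sizes are illustrative assumptions, not from the original text). Two LSTMs, one reading the sequence forwards and one backwards, feed a single shared output layer, so every time step sees context from both directions.

```python
import torch
import torch.nn as nn

class BiLSTMTagger(nn.Module):
    """Hypothetical per-step classifier built on a bidirectional LSTM."""

    def __init__(self, input_size: int, hidden_size: int, num_outputs: int):
        super().__init__()
        # bidirectional=True runs separate forward and backward LSTMs
        # over the input and concatenates their hidden states per step.
        self.bilstm = nn.LSTM(input_size, hidden_size,
                              batch_first=True, bidirectional=True)
        # A single shared output layer receives both directions' states,
        # mirroring the "connected to the same output layer" design.
        self.output = nn.Linear(2 * hidden_size, num_outputs)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, input_size)
        states, _ = self.bilstm(x)   # (batch, seq_len, 2 * hidden_size)
        return self.output(states)   # per-step predictions

# Hypothetical sizes, for illustration only.
model = BiLSTMTagger(input_size=26, hidden_size=64, num_outputs=10)
sequence = torch.randn(8, 50, 26)    # batch of 8 sequences, length 50
predictions = model(sequence)        # shape: (8, 50, 10)
```

Note that no time-window or target delay parameter appears anywhere in the sketch: the prediction at each step is conditioned on the entire sequence, with the network itself deciding how much of that context to use.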