In the context of the bidirectional-LSTM neural parser (Kiperwasser
and Goldberg, 2016), an approach is proposed in which the parsing state
is initialized without punctuation tokens, while those tokens are still
used in the BiLSTM sentence encoding. The relevant information carried
by the punctuation tokens should then be learned implicitly, through
the errors back-propagated to the recurrent contributions alone.
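The idea can be sketched as follows. This is an illustrative toy, not the authors' implementation: `encode_bilstm` stands in for a real BiLSTM, and `PUNCT_TAGS`, `init_buffer`, and the vector contents are assumptions made for the example. The key point is that punctuation influences every token's contextual vector, but punctuation tokens themselves are excluded from the initial parser buffer.

```python
# Hypothetical set of punctuation POS tags (illustrative, not exhaustive).
PUNCT_TAGS = {".", ",", ":", "``", "''", "PUNCT"}

def encode_bilstm(tokens):
    # Toy stand-in for a BiLSTM encoder: each token's "vector" is the
    # pair (#tokens to its left, #tokens to its right), so every vector
    # depends on the whole sentence, punctuation included.
    n = len(tokens)
    return [(i, n - 1 - i) for i in range(n)]

def init_buffer(tokens, tags):
    # Encode ALL tokens (punctuation contributes to the context here),
    # then keep only non-punctuation tokens in the initial parsing state.
    vectors = encode_bilstm(tokens)
    return [(tok, vec)
            for tok, tag, vec in zip(tokens, tags, vectors)
            if tag not in PUNCT_TAGS]

tokens = ["No", ",", "it", "was", "not", "."]
tags = ["UH", ",", "PRP", "VBD", "RB", "."]
buffer = init_buffer(tokens, tags)
# Punctuation is absent from the buffer, yet its presence shifted the
# positional vectors of the remaining tokens.
```

In a real model, the gradient flowing back from parsing errors through the surviving tokens' vectors is the only signal that teaches the BiLSTM what the punctuation tokens contribute.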