How to implement a bidirectional RNN in TensorFlow
Single-layer bidirectional RNN
TensorFlow already provides an interface for bidirectional RNNs: tf.nn.bidirectional_dynamic_rnn(). Let's first look at how this interface is used.
bidirectional_dynamic_rnn(
    cell_fw,                # forward RNN cell
    cell_bw,                # backward RNN cell
    inputs,                 # input sequence
    sequence_length=None,   # lengths of the sequences in the batch
    initial_state_fw=None,  # initial state of the forward RNN cell
    initial_state_bw=None,  # initial state of the backward RNN cell
    dtype=None,             # data type
    parallel_iterations=None,
    swap_memory=False,
    time_major=False,
    scope=None
)
Return value: a tuple (outputs, output_states), where outputs is itself a tuple (outputs_fw, outputs_bw). If time_major=True, then outputs_fw and outputs_bw are also time-major; otherwise they are batch-major. If you want to concatenate the two directions, simply use tf.concat(outputs, 2).
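To make the output shapes concrete, here is a minimal NumPy sketch (not TensorFlow) of the concatenation step. The shapes below assume time_major=False, i.e. batch-major [batch, time, hidden] outputs; the array values are random stand-ins for the two directions' outputs.

```python
import numpy as np

batch, time_steps, hidden = 2, 5, 10

# Stand-ins for outputs_fw and outputs_bw in batch-major layout
# (time_major=False): shape [batch, time, hidden].
outputs_fw = np.random.rand(batch, time_steps, hidden)
outputs_bw = np.random.rand(batch, time_steps, hidden)

# tf.concat(outputs, 2) joins the two directions along the feature
# axis, yielding one vector of size 2*hidden per time step.
out = np.concatenate([outputs_fw, outputs_bw], axis=2)
assert out.shape == (batch, time_steps, 2 * hidden)
```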
How to use it:
bidirectional_dynamic_rnn is used very much like dynamic_rnn:

1. Define the forward and backward rnn_cells.
2. Define the initial states for the forward and backward rnn_cells.
3. Prepare the input sequence.
4. Call bidirectional_dynamic_rnn.

import tensorflow as tf
from tensorflow.contrib import rnn

cell_fw = rnn.LSTMCell(10)
cell_bw = rnn.LSTMCell(10)
initial_state_fw = cell_fw.zero_state(batch_size, tf.float32)
initial_state_bw = cell_bw.zero_state(batch_size, tf.float32)
seq = ...
seq_length = ...
(outputs, output_states) = tf.nn.bidirectional_dynamic_rnn(
    cell_fw, cell_bw, seq, seq_length,
    initial_state_fw, initial_state_bw)
out = tf.concat(outputs, 2)
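Under the hood, the backward direction is nothing more than the ordinary forward recurrence run on the time-reversed input, with its outputs flipped back so that index t lines up with the forward outputs. The following is a minimal NumPy sketch of that idea using a plain tanh RNN cell (not an LSTM) with hypothetical weight matrices; it is a conceptual illustration, not the TensorFlow implementation.

```python
import numpy as np

def rnn_forward(x, W, U, b):
    """Plain tanh RNN over a [time, input_dim] sequence; returns [time, hidden]."""
    h = np.zeros(U.shape[0])
    outs = []
    for x_t in x:
        h = np.tanh(x_t @ W + h @ U + b)
        outs.append(h)
    return np.stack(outs)

rng = np.random.default_rng(0)
time_steps, input_dim, hidden = 5, 4, 10
x = rng.normal(size=(time_steps, input_dim))

# Separate (hypothetical) parameters for each direction, as the fw/bw
# cells in bidirectional_dynamic_rnn have independent weights.
W_fw = rng.normal(size=(input_dim, hidden))
U_fw = rng.normal(size=(hidden, hidden))
W_bw = rng.normal(size=(input_dim, hidden))
U_bw = rng.normal(size=(hidden, hidden))
b = np.zeros(hidden)

outputs_fw = rnn_forward(x, W_fw, U_fw, b)
# Backward direction: run on the time-reversed input, then flip the
# outputs back so index t aligns with the forward outputs.
outputs_bw = rnn_forward(x[::-1], W_bw, U_bw, b)[::-1]

# Same concatenation as tf.concat(outputs, 2): [time, 2*hidden].
out = np.concatenate([outputs_fw, outputs_bw], axis=-1)
assert out.shape == (time_steps, 2 * hidden)
```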