Keras LSTM state


I want to run an LSTM in Keras and get both the output and the states. In TF I would do something like the following:

with tf.variable_scope("rnn"):
    for time_step in range(num_steps):
        if time_step > 0:
            tf.get_variable_scope().reuse_variables()
        (cell_output, state) = cell(inputs[:, time_step, :], state)
        outputs.append(cell_output)

Is there a way in Keras to get the last state and feed in new inputs when the length of the sequence is huge? I am aware of stateful=True, but I want to have access to the states while training too. I know it uses a scan and not a loop, but basically I want to save the states and, on the next run, make them the starting states of the LSTM. In a nutshell: I want both the output and the states.
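For reference, this is the stateful=True route I am aware of: between batches the state variables can be read and written through the backend (a rough sketch, all shapes made up), but it does not give me the state tensors inside the training graph:

import numpy as np
import keras.backend as K
from keras.models import Sequential
from keras.layers import LSTM

# stateful layers need a fixed batch size: (batch, timesteps, features)
model = Sequential()
model.add(LSTM(20, batch_input_shape=(32, 10, 8), stateful=True))
model.compile(optimizer='sgd', loss='mse')

x_chunk = np.random.random((32, 10, 8))   # one chunk of a very long sequence
y_chunk = np.random.random((32, 20))
model.train_on_batch(x_chunk, y_chunk)

# the states survive between batches and can be read back ...
h, c = [K.get_value(s) for s in model.layers[0].states]
# ... or overwritten, to make them the starting states of the next run
K.set_value(model.layers[0].states[0], h)
K.set_value(model.layers[0].states[1], c)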

Since LSTM is a layer, and a layer can have only one output in Keras (correct me if I am wrong), you cannot get two outputs simultaneously without modifying the source code.

I have recently been hacking Keras to implement some advanced structures. Here are some of my thoughts; you might not like them, but they work. What I do is override the Keras layer so that we can access the tensor representing the hidden states.

Firstly, you can check the call() function in keras/layers/recurrent.py to see how Keras makes it work:

def call(self, x, mask=None):
    # input shape: (nb_samples, time (padded with zeros), input_dim)
    # note that the .build() method of subclasses MUST define
    # self.input_spec with a complete input shape.
    input_shape = self.input_spec[0].shape
    if K._BACKEND == 'tensorflow':
        if not input_shape[1]:
            raise Exception('When using TensorFlow, you should define '
                            'explicitly the number of timesteps of '
                            'your sequences.\n'
                            'If your first layer is an Embedding, '
                            'make sure to pass it an "input_length" '
                            'argument. Otherwise, make sure '
                            'the first layer has '
                            'an "input_shape" or "batch_input_shape" '
                            'argument, including the time axis. '
                            'Found input shape at layer ' + self.name +
                            ': ' + str(input_shape))
    if self.stateful:
        initial_states = self.states
    else:
        initial_states = self.get_initial_states(x)
    constants = self.get_constants(x)
    preprocessed_input = self.preprocess_input(x)

    last_output, outputs, states = K.rnn(self.step, preprocessed_input,
                                         initial_states,
                                         go_backwards=self.go_backwards,
                                         mask=mask,
                                         constants=constants,
                                         unroll=self.unroll,
                                         input_length=input_shape[1])
    if self.stateful:
        self.updates = []
        for i in range(len(states)):
            self.updates.append((self.states[i], states[i]))

    if self.return_sequences:
        return outputs
    else:
        return last_output
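Note the key line: K.rnn already returns the final states; the layer simply drops them unless stateful is set. To see the backend contract in isolation, here is a tiny standalone sketch (the step function and all shapes are made up for illustration):

import numpy as np
import keras.backend as K

x = K.placeholder(shape=(2, 5, 3))  # (samples, timesteps, features)

def step(inputs, states):
    # toy step: new state = old state + sum of the current inputs
    new = states[0] + K.sum(inputs, axis=-1, keepdims=True)
    return new, [new]

initial_states = [K.zeros((2, 1))]
last_output, outputs, states = K.rnn(step, x, initial_states,
                                     unroll=True, input_length=5)

f = K.function([x], [last_output] + states)
out, final_state = f([np.ones((2, 5, 3))])  # both come back together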

Secondly, we should override our layer; here is a simple script:

import keras.backend as K
from keras.layers import Input, LSTM

class MyLSTM(LSTM):
    def call(self, x, mask=None):
        # .... blablabla, right before return

        # we add this line to access the states
        self.extra_output = states

        if self.return_sequences:
            # .... blablabla, to the end
            # you should copy **exactly the same code** from keras.layers.recurrent

i = Input(shape=(...))
lstm = MyLSTM(20)
output = lstm(i)  # by calling the layer, `call()` runs and creates `lstm.extra_output`
extra_output = lstm.extra_output  # refer to the target

calculate_function = K.function(inputs=[i], outputs=extra_output + [output])
# use this function to calculate them **simultaneously**
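For completeness, here is what the override can look like once the elided parts are pasted in from the call() listing above. This is a sketch against the Keras 1.x internals shown here (I drop the TensorFlow timestep check for brevity), so later versions may differ:

import keras.backend as K
from keras.layers import LSTM

class MyLSTM(LSTM):
    def call(self, x, mask=None):
        input_shape = self.input_spec[0].shape
        if self.stateful:
            initial_states = self.states
        else:
            initial_states = self.get_initial_states(x)
        constants = self.get_constants(x)
        preprocessed_input = self.preprocess_input(x)

        last_output, outputs, states = K.rnn(self.step, preprocessed_input,
                                             initial_states,
                                             go_backwards=self.go_backwards,
                                             mask=mask,
                                             constants=constants,
                                             unroll=self.unroll,
                                             input_length=input_shape[1])
        # the one extra line: keep a handle on the hidden-state tensors
        self.extra_output = states

        if self.stateful:
            self.updates = []
            for i in range(len(states)):
                self.updates.append((self.states[i], states[i]))

        if self.return_sequences:
            return outputs
        else:
            return last_output

The usage is the same as in the short script above: build the layer, call it on an input, then read lstm.extra_output.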
