I am trying to implement word-level attention as described in Teaching Machines to Read and Comprehend and Improved Representation Learning for Question Answer Matching. My code works, but I am still not sure whether the dimensions of my attention parameters are correct. Following the previous threads on the same topic (Issue 4962, Issue 1472), I came up with the following snippet:
from keras.layers import Input, Embedding, LSTM, Bidirectional, Dense, GlobalMaxPooling1D, RepeatVector, TimeDistributed, Activation, Merge
from keras.models import Model

TS = dimx = dimy = 50
inpx = Input(shape=(dimx,), dtype='int32', name='inpx')
inpy = Input(shape=(dimy,), dtype='int32', name='inpy')
x = Embedding(1000, 100, input_length=dimx)(inpx)
y = Embedding(1000, 100, input_length=dimy)(inpy)
shared_lstm = Bidirectional(LSTM(100, return_sequences=True), merge_mode='concat')
ques = shared_lstm(x)

########## word-level attention ##############
O_q = GlobalMaxPooling1D()(ques)
q_vec = Dense(1)(O_q)                   # eqn 11 - for the ques vector; this product is not computed across time steps
q_vec = RepeatVector(TS)(q_vec)         # replicate q_vec so it can be added at every time step
h_a = shared_lstm(y)
a_vec = TimeDistributed(Dense(1))(h_a)  # eqn 11 - for a_vec, sharing weights across all time steps
m = Merge(mode='sum')([q_vec, a_vec])   # eqn 11 - adding q_vec and a_vec
m = Activation(activation='tanh')(m)
s = TimeDistributed(Dense(1, activation='softmax'))(m)  # eqn 12 - softmax score across all time steps
h_hat_a = Merge(mode='mul')([h_a, s])   # eqn 13 - weighting h_a by the attention scores
# mod = Model([inpx, inpy], h_hat_a)
O_a = GlobalMaxPooling1D()(h_hat_a)
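One caveat worth flagging in the scoring step: TimeDistributed(Dense(1, activation='softmax')) applies the softmax independently at each time step over a single unit, and a softmax over a single value is identically 1.0, so the scores never get normalized across time. Eqn 12 normalizes over the time axis; a minimal sketch of one way to do that with stock Keras layers (Flatten, Activation, Reshape), continuing from the snippet above:

from keras.layers import Flatten, Reshape

s = TimeDistributed(Dense(1))(m)  # unnormalized score per time step, shape (batch, TS, 1)
s = Flatten()(s)                  # shape (batch, TS)
s = Activation('softmax')(s)      # softmax taken across all TS time steps
s = Reshape((TS, 1))(s)           # back to (batch, TS, 1) for the element-wise product with h_a

As an aside, in Keras 2 the deprecated Merge(mode='sum') / Merge(mode='mul') layers are replaced by the functional helpers add([...]) and multiply([...]).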
My code for attention is correct. I got the intended results on the datasets used in the papers.
I also contacted the authors of Improved Representation Learning for Question Answer Matching about the dimensions of the attention parameters, and it looks like I am on the right track.
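For anyone who wants to sanity-check the dimensions themselves: building the model and printing a summary lists every layer's output shape and parameter count. A minimal sketch, continuing from the snippet above (Keras 1.x keyword names; Keras 2 uses inputs=/outputs=):

# O_a is the attention-pooled answer representation from the snippet above
mod = Model(input=[inpx, inpy], output=O_a)
mod.summary()  # prints each layer's output shape and parameter count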