Keras RNN model - problem with determining the input shape

I'm building a multi-class text model in Keras. The general design, to start, involves processing texts with the quanteda package into its native 'Document-feature matrix' (dfm) format, which is then passed into the model. I've successfully built a simple dense NN without problems. The issue starts when I change the model design to an RNN.

My input shape is:

> dim(dfm_train)
[1] 16083  1868

Then my model design looks like the following:

model <- keras_model_sequential() %>% 
  layer_simple_rnn(units = 64, input_shape = c(dim(dfm_train)[[2]]), return_sequences = TRUE) %>% 
  layer_dropout(rate = 0.5) %>% 
  layer_simple_rnn(units = 64, return_sequences = TRUE) %>% 
  layer_dropout(rate = 0.5) %>% 
  layer_dense(units = 16, activation = "softmax")

And the error I receive is the following:

Error in py_call_impl(callable, dots$args, dots$keywords) : 
  ValueError: Input 0 is incompatible with layer simple_rnn_28: expected ndim=3, found ndim=2 

I've been experimenting with return_sequences, changing the input dimensions, and adding layer_reshape into the mix, but unfortunately I'm not able to get past this error. Can anyone help?


Input data (as well as targets) for RNNs has to be 3-D, with shape (samples, timesteps, features). A dfm is 2-D (documents × features), which is why the layer reports "expected ndim=3, found ndim=2".

This post has a detailed explanation (jump to "Reshaping the data").
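As a minimal sketch of the reshaping idea: you can lift the 2-D dfm into a 3-D array by treating each document as a sequence of length 1, then give the RNN layer a matching 2-D input_shape (timesteps, features). This assumes you keep the bag-of-words representation from your question; dfm_train is the matrix from your post, and x_train is a name introduced here for illustration:

library(keras)

# A dfm is 2-D: (documents, features). An RNN expects 3-D input:
# (samples, timesteps, features). Treat each document as a
# length-1 sequence so the shapes line up:
x_train <- array_reshape(as.matrix(dfm_train),
                         c(dim(dfm_train)[1], 1, dim(dfm_train)[2]))

model <- keras_model_sequential() %>%
  # input_shape is (timesteps, features), batch dimension omitted
  layer_simple_rnn(units = 64,
                   input_shape = c(1, dim(dfm_train)[[2]])) %>%
  layer_dropout(rate = 0.5) %>%
  # return_sequences defaults to FALSE, so the output is 2-D
  # and can feed straight into the dense classifier
  layer_dense(units = 16, activation = "softmax")

Two caveats: return_sequences = TRUE on the final RNN layer would keep the output 3-D and break the dense layer, so it's dropped here; and a length-1 sequence gives the RNN nothing to recur over, so to actually exploit word order you'd tokenize into integer sequences (e.g. with texts_to_sequences() and pad_sequences()) instead of using a dfm.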

