NumPy array for encoder_input_data is too big

I am following the “Build Chatbots with Python Skill Path” course to build my first generative chatbot. The course has me create one-hot vectors to feed into Keras for a seq2seq model. However, when I run this line:

encoder_input_data = np.zeros((len(input_docs), max_encoder_seq_length, num_encoder_tokens), dtype='float32')

I get the error: “Unable to allocate 18.1 TiB for an array with shape (221616, 382, 58688) and data type float32”
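For what it’s worth, the 18.1 TiB figure checks out if you multiply the shape out (float32 is 4 bytes per element, and a TiB is 2**40 bytes):

print(221616 * 382 * 58688 * 4 / 2 ** 40)  # bytes -> TiB, prints roughly 18.07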

I understand that the array is too big, but I’m not sure where to go from here. How can I make it smaller? Should I use a smaller training set, or can I just change the parameters manually?

You could simply reduce the size of the training sample. The one-hot array grows as documents × sequence length × vocabulary size, so truncating the corpus before you compute max_encoder_seq_length and num_encoder_tokens shrinks all three dimensions at once; see the sketch below.
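Here is a minimal sketch of that idea, assuming your script builds an input_docs list before this step (the variable names follow your snippet; the toy corpus and the 500-document cutoff are just placeholders):

import numpy as np

# Toy stand-in for the parsed corpus; in the course script this list
# comes from the dialogue data.
input_docs = ["hi there", "how are you doing", "see you later"] * 1000

# Keep only the first NUM_SAMPLES documents BEFORE computing the
# vocabulary and sequence-length stats, so every dimension shrinks.
NUM_SAMPLES = 500
input_docs = input_docs[:NUM_SAMPLES]

# Recompute the dimensions from the reduced sample.
input_tokens = sorted({tok for doc in input_docs for tok in doc.split()})
num_encoder_tokens = len(input_tokens)
max_encoder_seq_length = max(len(doc.split()) for doc in input_docs)

encoder_input_data = np.zeros(
    (len(input_docs), max_encoder_seq_length, num_encoder_tokens),
    dtype='float32')
print(encoder_input_data.shape)  # (500, 4, 8) for this toy corpus

If your script also builds a target_docs list, slice it with the same cutoff so the question/answer pairs stay aligned.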
