
ValueError: generator yielded an element of shape (3,) where an element of shape (?, 3) was expected. #255

Open
millmi17 opened this issue Sep 14, 2021 · 1 comment

@millmi17

I am getting the error below when calling model.fit(X_train, early_stopping=False). To get X_train, I loaded a CSV with the load_from_csv function, which returns the array below (a rough sketch of the loading call follows the array). I am new to GitHub, so if you need anything else to help track this down, let me know.
array([['bill ', 'takes', 'calc'],
['bill ', 'is a ', 'person'],
['fred ', 'takes ', 'eng'],
['fred ', 'takes', 'chem'],
['chem ', 'located in ', 'pike'],
['pike ', 'is a ', 'building'],
['calc', 'located in ', 'smith'],
['calc ', 'is a ', 'building'],
['fred ', 'is a ', 'person']], dtype=object)
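
Roughly, the loading step looks like this (the directory, file name, and separator here are placeholders rather than my exact paths):

from ampligraph.datasets import load_from_csv

# Load (subject, predicate, object) triples from a comma-separated file.
# Directory and file name are placeholders.
x = load_from_csv('.', 'courses.csv', sep=',')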

This was just a quick example I put together to see if I could get ampligraph working. I then use the code below to get the train/test split, following the ComplEx example code. The output is shown below.

X_train, X_test = train_test_split_no_unseen(x, test_size=1)
X_train[1]
array(['bill ', 'takes', 'calc'], dtype=object)

model = ComplEx(batches_count=100,
                seed=0,
                epochs=1,
                k=150,
                eta=5,
                optimizer='adam',
                optimizer_params={'lr': 1e-3},
                loss='multiclass_nll',
                regularizer='LP',
                regularizer_params={'p': 3, 'lambda': 1e-5},
                verbose=True)

tf.logging.set_verbosity(tf.logging.ERROR)

model.fit(X_train, early_stopping = False)

InvalidArgumentError Traceback (most recent call last)
~/dss/code-envs/python/kge/lib64/python3.6/site-packages/tensorflow_core/python/client/session.py in _do_call(self, fn, *args)
1364 try:
-> 1365 return fn(*args)
1366 except errors.OpError as e:

~/dss/code-envs/python/kge/lib64/python3.6/site-packages/tensorflow_core/python/client/session.py in _run_fn(feed_dict, fetch_list, target_list, options, run_metadata)
1349 return self._call_tf_sessionrun(options, feed_dict, fetch_list,
-> 1350 target_list, run_metadata)
1351

~/dss/code-envs/python/kge/lib64/python3.6/site-packages/tensorflow_core/python/client/session.py in _call_tf_sessionrun(self, options, feed_dict, fetch_list, target_list, run_metadata)
1442 fetch_list, target_list,
-> 1443 run_metadata)
1444

InvalidArgumentError: ValueError: generator yielded an element of shape (3,) where an element of shape (?, 3) was expected.
Traceback (most recent call last):

File "/home/dataiku/dss/code-envs/python/kge/lib64/python3.6/site-packages/tensorflow_core/python/ops/script_ops.py", line 235, in call
ret = func(*args)

File "/home/dataiku/dss/code-envs/python/kge/lib64/python3.6/site-packages/tensorflow_core/python/data/ops/dataset_ops.py", line 630, in generator_py_func
"of shape %s was expected." % (ret_array.shape, expected_shape))

ValueError: generator yielded an element of shape (3,) where an element of shape (?, 3) was expected.

 [[{{node PyFunc}}]]
 [[IteratorGetNext]]

During handling of the above exception, another exception occurred:

InvalidArgumentError Traceback (most recent call last)
in
2 tf.logging.set_verbosity(tf.logging.ERROR)
3
----> 4 model.fit(X_train, early_stopping = False)

~/dss/code-envs/python/kge/lib/python3.6/site-packages/ampligraph/latent_features/models/ComplEx.py in fit(self, X, early_stopping, early_stopping_params, focusE_numeric_edge_values, tensorboard_logs_path)
371 """
372 super().fit(X, early_stopping, early_stopping_params, focusE_numeric_edge_values,
--> 373 tensorboard_logs_path=tensorboard_logs_path)
374
375 def predict(self, X, from_idx=False):

~/dss/code-envs/python/kge/lib/python3.6/site-packages/ampligraph/latent_features/models/EmbeddingModel.py in fit(self, X, early_stopping, early_stopping_params, focusE_numeric_edge_values, tensorboard_logs_path)
1210 except BaseException as e:
1211 self._end_training()
-> 1212 raise e
1213
1214 def set_filter_for_eval(self):

~/dss/code-envs/python/kge/lib/python3.6/site-packages/ampligraph/latent_features/models/EmbeddingModel.py in fit(self, X, early_stopping, early_stopping_params, focusE_numeric_edge_values, tensorboard_logs_path)
1155 self.sess_train.run(self.ent_emb)[:unique_entities.shape[0], :]
1156 else:
-> 1157 loss_batch, _ = self.sess_train.run([loss, train], feed_dict=feed_dict)
1158
1159 if np.isnan(loss_batch) or np.isinf(loss_batch):

~/dss/code-envs/python/kge/lib64/python3.6/site-packages/tensorflow_core/python/client/session.py in run(self, fetches, feed_dict, options, run_metadata)
954 try:
955 result = self._run(None, fetches, feed_dict, options_ptr,
--> 956 run_metadata_ptr)
957 if run_metadata:
958 proto_data = tf_session.TF_GetBuffer(run_metadata_ptr)

~/dss/code-envs/python/kge/lib64/python3.6/site-packages/tensorflow_core/python/client/session.py in _run(self, handle, fetches, feed_dict, options, run_metadata)
1178 if final_fetches or final_targets or (handle and feed_dict_tensor):
1179 results = self._do_run(handle, final_targets, final_fetches,
-> 1180 feed_dict_tensor, options, run_metadata)
1181 else:
1182 results = []

~/dss/code-envs/python/kge/lib64/python3.6/site-packages/tensorflow_core/python/client/session.py in _do_run(self, handle, target_list, fetch_list, feed_dict, options, run_metadata)
1357 if handle is None:
1358 return self._do_call(_run_fn, feeds, fetches, targets, options,
-> 1359 run_metadata)
1360 else:
1361 return self._do_call(_prun_fn, handle, feeds, fetches)

~/dss/code-envs/python/kge/lib64/python3.6/site-packages/tensorflow_core/python/client/session.py in _do_call(self, fn, *args)
1382 '\nsession_config.graph_options.rewrite_options.'
1383 'disable_meta_optimizer = True')
-> 1384 raise type(e)(node_def, op, message)
1385
1386 def _extend_graph(self):

@sumitpai
Contributor

Thanks for reporting this bug. We had not handled this edge case gracefully: your training set is much smaller than batches_count, so each batch ends up containing a single triple, which triggers this error. I will mark this as a bug. As a workaround, you can try a smaller batches_count, say 3 or 4.
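
For example, with your 8 training triples something along these lines should work (same hyperparameters as above, only batches_count reduced; this is a sketch of the workaround, not the eventual fix):

model = ComplEx(batches_count=3,  # small enough that each batch holds more than one triple
                seed=0,
                epochs=1,
                k=150,
                eta=5,
                optimizer='adam',
                optimizer_params={'lr': 1e-3},
                loss='multiclass_nll',
                regularizer='LP',
                regularizer_params={'p': 3, 'lambda': 1e-5},
                verbose=True)

model.fit(X_train, early_stopping=False)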

@sumitpai sumitpai self-assigned this Sep 16, 2021
@sumitpai sumitpai added the bug Something isn't working label Sep 16, 2021