Hey,
I'm trying to recreate the wall-to-wall example, but after creating the datacube and running it through the encoder, I'm getting a shape-mismatch error:
EinopsError: Error while processing repeat-reduction pattern "B D -> B L D".
Input tensor shape: torch.Size([48]). Additional info: {'L': 1024}.
Wrong shape: expected 2 dims. Received 1-dim tensor.
EinopsError Traceback (most recent call last)
File /opt/anaconda3/envs/claymodel/lib/python3.11/site-packages/einops/einops.py:522, in reduce(tensor, pattern, reduction, **axes_lengths)
521 shape = backend.shape(tensor)
--> 522 recipe = _prepare_transformation_recipe(pattern, reduction, axes_names=tuple(axes_lengths), ndim=len(shape))
523 return _apply_recipe(
524 backend, recipe, cast(Tensor, tensor), reduction_type=reduction, axes_lengths=hashable_axes_lengths
525 )
File /opt/anaconda3/envs/claymodel/lib/python3.11/site-packages/einops/einops.py:365, in _prepare_transformation_recipe(pattern, operation, axes_names, ndim)
364 if ndim != len(left.composition):
--> 365 raise EinopsError(f"Wrong shape: expected {len(left.composition)} dims. Received {ndim}-dim tensor.")
366 left_composition = left.composition
EinopsError: Wrong shape: expected 2 dims. Received 1-dim tensor.
During handling of the above exception, another exception occurred:
EinopsError Traceback (most recent call last)
Cell In[87], line 7
1 #Run the model
2
3 #pass the datacube prepared above to the model to create embeddings.
4 #this will create 1 embedding vector for each image we downloaded
6 with torch.no_grad():
...
--> 533 raise EinopsError(message + "\n {}".format(e))
EinopsError: Error while processing repeat-reduction pattern "B D -> B L D".
Input tensor shape: torch.Size([48]). Additional info: {'L': 1024}.
Wrong shape: expected 2 dims. Received 1-dim tensor.
What exactly is the issue?