I want to run the network on grayscale images (single channel).
I get this error when running the network on grayscale images:
Traceback (most recent call last):
File "demo_imgs.py", line 100, in
demo(args)
File "demo_imgs.py", line 50, in demo
image1 = load_image(imfile1)
File "demo_imgs.py", line 29, in load_image
img = torch.from_numpy(img).permute(2, 0, 1).float()
RuntimeError: number of dims don't match in permute
I tried replicating the same gray values across all 3 channels, but the results are not very good.
I see that ETH3D is a grayscale image dataset, so I also tried the shared ETH3D checkpoint, but I still get the above error.
Could you please share what changes are needed to adapt the network to grayscale images?
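For reference, the permute error itself happens because a grayscale image loads as a 2-D `(H, W)` array, so `permute(2, 0, 1)` has no third dimension to move. A minimal sketch of a channel-safe loader (the helper name and the assumption that the image arrives as a NumPy array are mine, not from the repository's `demo_imgs.py`):

```python
import numpy as np
import torch

def load_image_gray_safe(img: np.ndarray) -> torch.Tensor:
    """Convert an image array to a (3, H, W) float tensor.

    Hypothetical helper: a grayscale image arrives as (H, W), which
    makes permute(2, 0, 1) fail with "number of dims don't match".
    Stacking the single channel three times restores the (H, W, 3)
    layout that RGB-trained networks expect.
    """
    if img.ndim == 2:
        # (H, W) -> (H, W, 3) by repeating the gray channel
        img = np.stack([img] * 3, axis=-1)
    return torch.from_numpy(img).permute(2, 0, 1).float()
```

This only removes the crash; it is the same channel-replication trick described above, so if the results are still poor, the limitation is likely the RGB-trained weights rather than the loading code.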