I'm sorry to bother you, but I have some questions... #10
Comments
I have the same problem. Did you fix it?
Sorry, I have not fixed it yet.
If you have solved this bug, please tell me, thank you! If I fix it, I will tell you too.
@erhuodaosi Hi, I gotta admit the code is pretty old haha, but a quick fix for that training issue would be to replace the subtraction operator with a comma, I think. So pass the two sliced tensors as individual args. The second issue looks to be Qt specific (or it could be that my Qt app code is outdated). Since it seems like some people are interested, I might update this repo to support PyTorch 1.0 and Qt properly.
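A minimal sketch of that fix, reusing the total_variation_loss and l1 names from the traceback below and assuming l1 is an nn.L1Loss module (whose forward() needs both an input and a target):

import torch
import torch.nn as nn

def total_variation_loss(image, l1):
    # Pass the two shifted slices as separate arguments instead of subtracting
    # them first, so L1Loss receives both its input and its target tensors.
    loss = l1(image[:, :, :, :-1], image[:, :, :, 1:]) + l1(image[:, :, :-1, :], image[:, :, 1:, :])
    return loss

# example usage: tv = total_variation_loss(composed_output, nn.L1Loss())

This computes the same mean absolute difference between neighbouring pixels; the subtraction just happens inside L1Loss instead of before the call.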
@bobqywei Thank you very much. As for the second question, it runs on Ubuntu 16.04 LTS but gets an error on the server, so maybe the server does not support the Qt platform. As for the first question, I will try your advice.
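(A possible workaround for that second issue, as a sketch and assuming inpaint.py creates a Qt QApplication on a server with no X display: force Qt's offscreen platform plugin before the application starts. Note this only lets Qt initialize without a display; an interactive painting window would still need a real display or X forwarding.)

import os
# Must be set before the QApplication is constructed; "offscreen" is one of
# the platform plugins listed as available in the error message below.
os.environ["QT_QPA_PLATFORM"] = "offscreen"

The same effect can be had by exporting QT_QPA_PLATFORM=offscreen in the shell before running the script.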
I found another version and it works.
@Lvhhhh Thanks a lot, I will give it a try.
First of all, when I run python train.py, an error occurred:
(py3) [fl@ibcu05 place2]$ python train.py
Loaded training dataset with 1434892 samples and 55116 masks
Loaded model to device...
Setup Adam optimizer...
Setup loss function...
EPOCH:0 of 3 - starting training loop from iteration:0 to iteration:89680
0%| | 0/89680 [00:00<?, ?it/s]
Traceback (most recent call last):
File "train.py", line 141, in
loss_dict = loss_func(image, mask, output, gt)
File "/home/fl/.local/lib/python3.6/site-packages/torch/nn/modules/module.py", line 489, in call
result = self.forward(*input, **kwargs)
File "/home/fl/place2/loss.py", line 94, in forward
loss_dict["tv"] = total_variation_loss(composed_output, self.l1) * LAMBDAS["tv"]
File "/home/fl/place2/loss.py", line 50, in total_variation_loss
loss = l1(image[:, :, :, :-1] - image[:, :, :, 1:]) + l1(image[:, :, :-1, :] - image[:, :, 1:, :])
File "/home/fl/.local/lib/python3.6/site-packages/torch/nn/modules/module.py", line 489, in call
result = self.forward(*input, **kwargs)
TypeError: forward() missing 1 required positional argument: 'target'
In addition, when I run python inpaint.py, I get:
qt.qpa.xcb: could not connect to display
qt.qpa.plugin: Could not load the Qt platform plugin "xcb" in "" even though it was found.
This application failed to start because no Qt platform plugin could be initialized. Reinstalling the application may fix this problem.
Available platform plugins are: eglfs, linuxfb, minimal, minimalegl, offscreen, vnc, wayland-egl, wayland, wayland-xcomposite-egl, wayland-xcomposite-glx, webgl, xcb.
Aborted (core dumped)
I would very much appreciate it if anyone could help me, thank you!