
Reproducing Results #2

Open
galyona opened this issue Aug 13, 2017 · 6 comments

galyona commented Aug 13, 2017

Hello!

I've fixed a few small bugs in the code (perhaps version compatibility issues) but could not get the training to converge.
Were you able to reproduce the results in the original article using your code?
If so, could you post some guidelines?

Thanks!

@hardikbansal
Owner

No, I haven't reproduced the results.

Have you read the code? Did you find any errors or anything that I have not handled correctly?

Thanks,
Hardik

LynnHo commented Oct 26, 2017

[attached image]

@hardikbansal I don't think it's appropriate to invert the attribute label when training the encoder. There should be a more sensible way to handle it, but I can't find one in the paper.

@hardikbansal
Owner

Yup, in the paper they flip the attributes rather than defining two separate attributes for the same thing. I haven't tried changing it yet.
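
For what it's worth, a minimal sketch of the two representations being discussed; the attribute name and shapes here are hypothetical, just to illustrate the idea, not taken from this repo:

```python
import numpy as np

# Hypothetical single binary attribute, e.g. "smiling", encoded in one slot.
attr = np.array([1.0])             # smiling = yes

# Option 1: flip the attribute in place (the paper reportedly does this).
flipped = 1.0 - attr               # -> [0.0], smiling = no

# Option 2: keep two mutually exclusive slots for the same concept.
two_slot = np.array([1.0, 0.0])    # [smiling, not_smiling]
two_slot_flipped = two_slot[::-1]  # -> [0.0, 1.0]
```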

@hardikbansal
Owner

@LynnHo Did you find any other problems in the implementation that you think might be the reason this model is not working?

LynnHo commented Nov 2, 2017

@hardikbansal Sorry, I haven't found a solution yet.

@hardikbansal
Owner

I have made the necessary changes. There was an error in the loss function and also in the representation of the attributes: I was not converting the attributes from the [-1, 1] domain to the [0, 1] domain. I have not run it yet, though, as I don't have enough compute resources at the moment.
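
For reference, a minimal sketch of the kind of conversion described above (the variable names are hypothetical and the exact tensors in this repo may differ):

```python
import numpy as np

# Attribute labels as loaded from the annotation file, in the {-1, 1} domain
# (CelebA-style annotations use this convention, for example).
attrs = np.array([[-1.0, 1.0, -1.0],
                  [ 1.0, 1.0, -1.0]])

# Map {-1, 1} -> {0, 1} before feeding the attributes to the network / loss.
attrs_01 = (attrs + 1.0) / 2.0
# -> [[0., 1., 0.],
#     [1., 1., 0.]]
```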
