DC-GAN tutorial based on TensorFlow #108
Conversation
Oops! Spelling mistakes! 😬
Co-authored-by: Logan Kilpatrick <[email protected]>
Thanks @Dsantra92! Just some minor changes from me below:
@Dsantra92 great work so far! If we can address the suggestions above, this should be good to go.
@logankilpatrick I am done with the writing part, just refactoring the code a bit. I should be able to wrap it up today.
Anything more I should add?
I don't think so, but let me have another look tomorrow.
Right now I get: ERROR: UndefVarError: gen_ps not defined
Stacktrace:
[1] train_generator!(gen::Chain{Tuple{Dense{typeof(identity), Matrix{Float32}, Flux.Zeros}, BatchNorm{typeof(relu), Vector{Float32}, Float32, Vector{Float32}}, var"#6#7", ConvTranspose{2, 4, typeof(identity), Array{Float32, 4}, Flux.Zeros}, BatchNorm{typeof(relu), Vector{Float32}, Float32, Vector{Float32}}, ConvTranspose{2, 4, typeof(identity), Array{Float32, 4}, Flux.Zeros}, BatchNorm{typeof(relu), Vector{Float32}, Float32, Vector{Float32}}, ConvTranspose{2, 4, typeof(tanh), Array{Float32, 4}, Flux.Zeros}}}, disc::Chain{Tuple{Conv{2, 4, typeof(identity), Array{Float32, 4}, Vector{Float32}}, var"#8#10", Dropout{Float64, Colon}, Conv{2, 4, typeof(identity), Array{Float32, 4}, Vector{Float32}}, var"#9#11", Dropout{Float64, Colon}, typeof(flatten), Dense{typeof(identity), Matrix{Float32}, Vector{Float32}}}}, fake_img::Array{Float32, 4}, opt::ADAM, ps::Zygote.Params, hparams::HyperParams)
@ Main ./REPL[35]:3
[2] train(hparams::HyperParams)
@ Main ./REPL[39]:47
[3] top-level scope
@ REPL[41]:1
[4] top-level scope
@ ~/.julia/packages/CUDA/YpW0k/src/initialization.jl:52
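The `UndefVarError` above points at a stale variable name: the stacktrace shows `train_generator!` receives its parameters as an argument named `ps`, so the body most likely still references an old `gen_ps` binding. A minimal sketch of the likely fix, assuming Flux's implicit-parameters API of that era (the `generator_loss` helper is hypothetical, standing in for whatever loss the tutorial defines):

```julia
using Flux, Zygote

# Hypothetical generator loss: the generator wants the discriminator
# to score fake images as real (logit > 0).
generator_loss(fake_logits) =
    Flux.Losses.logitbinarycrossentropy(fake_logits, one.(fake_logits))

function train_generator!(gen, disc, fake_img, opt, ps, hparams)
    # Take gradients with respect to the generator's parameters `ps`.
    loss, back = Zygote.pullback(ps) do
        generator_loss(disc(fake_img))
    end
    grads = back(one(loss))
    # This is the line that previously used the undefined `gen_ps`;
    # it must use the `ps` argument instead.
    Flux.update!(opt, ps, grads)
    return loss
end
```

The fix itself is just renaming `gen_ps` to `ps` (or vice versa) so the gradient and update calls use the parameters actually passed in.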
Side note: has anyone looked into writing tutorials as Literate.jl or .jmd files so that they can be easily verified and run?
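For context on that side note: Literate.jl can turn a commented `.jl` script into Markdown (optionally executing the code), which would let CI verify a tutorial runs. A minimal sketch, assuming Literate.jl is installed and the tutorial lives in a hypothetical `dcgan.jl`:

```julia
using Literate  # not in the stdlib; assumes the Literate.jl package is installed

# Convert the script to Markdown in `output/`, executing its code blocks
# so a broken tutorial fails loudly instead of silently going stale.
Literate.markdown("dcgan.jl", "output/"; execute = true)
```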
Co-authored-by: Brian Chen <[email protected]>
Great work on this @Dsantra92, and thanks for the review @ToucheSir!
Thanks @logankilpatrick 🙌. This was a great learning experience, all thanks to you and @ToucheSir.
Related to #107