neural network architectures. which one are you. tag yourself.

#1
neural network architectures. which one are you. tag yourself.
[Image: neuralnetworks.png]
#2
RE: neural network architectures. which one are you. tag yourself.
I'm more of a NEAT than anything, but if I had to choose I'd go with an LSTM with its weights attached to a parameter updater.
#5
RE: neural network architectures. which one are you. tag yourself.

It is a very Kaynato question.
#6
RE: neural network architectures. which one are you. tag yourself.
https://twitter.com/ncasenmare/status/80...9856462848

i can't read this
#7
RE: neural network architectures. which one are you. tag yourself.
say that in public not to my face
#8
RE: neural network architectures. which one are you. tag yourself.
you're worth hearing
#9
RE: neural network architectures. which one are you. tag yourself.
i was too stubborn to do cbt,
instead what always happens, what always happens,
is continued self-destruction
#10
RE: neural network architectures. which one are you. tag yourself.
i like that neural network more
#11
RE: neural network architectures. which one are you. tag yourself.
me too.

variational autoencoders are p. great.
also considering how I can use an LSTM over a description of program actions.
#12
RE: neural network architectures. which one are you. tag yourself.
are there any good tutorials on neural networks?
#13
RE: neural network architectures. which one are you. tag yourself.
deeplearning.net has some wonderful tutorials. I should also link you my reading list, as it's been really useful.

I have PDFs all stored away... Sometime I oughta talk with you about this, actually.

Generally it's like this, in "stuff to understand":
Linear Algebra
Optimization as Loss minimization
Gradient Descent
The Perceptron
Linear Classifiers
Function fitting
Stochastic Gradient Descent
Activation functions
Backpropagation in Multilayer perceptrons

So on.
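
To make a few of those concrete, here's a rough sketch in plain numpy (toy XOR data, made-up layer sizes and learning rate, just to show the forward pass, the loss, backprop, and a gradient descent step in one place):

import numpy as np

np.random.seed(0)
# toy XOR data: inputs and targets
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# one hidden layer of 4 units, sigmoid activations everywhere
W1 = np.random.randn(2, 4); b1 = np.zeros((1, 4))
W2 = np.random.randn(4, 1); b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 1.0
for step in range(5000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)
    loss = np.mean((out - y) ** 2)       # the quantity we're minimizing

    # backward pass: chain rule by hand
    d_out = 2 * (out - y) / y.size * out * (1 - out)
    d_W2 = h.T @ d_out
    d_b2 = d_out.sum(axis=0, keepdims=True)
    d_h = (d_out @ W2.T) * h * (1 - h)
    d_W1 = X.T @ d_h
    d_b1 = d_h.sum(axis=0, keepdims=True)

    # gradient descent step
    W1 -= lr * d_W1; b1 -= lr * d_b1
    W2 -= lr * d_W2; b2 -= lr * d_b2

print(out.round(2))   # should head toward [[0], [1], [1], [0]]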

There are plenty of branches that are especially interesting, like VAEs and GANs, but once you get the general ideas of dimensionality reduction and discovering Gaussian latent structure, they make sense.
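
For the VAE side, the core idea fits in a few lines. This is only an illustrative numpy sketch with made-up weight shapes, not a trained model: the encoder maps x to a Gaussian (a mean and a log-variance) over a small latent code, the reparameterization trick draws a sample from it, and the loss is reconstruction error plus a KL term pulling that Gaussian toward a standard normal.

import numpy as np

np.random.seed(0)
x = np.random.rand(784)                     # e.g. a flattened 28x28 image
W_enc = np.random.randn(784, 2 * 8) * 0.01  # encoder weights (shapes made up)
W_dec = np.random.randn(8, 784) * 0.01      # decoder weights

enc = x @ W_enc
mu, log_var = enc[:8], enc[8:]              # Gaussian over an 8-dim latent code

eps = np.random.randn(8)                    # reparameterization trick:
z = mu + np.exp(0.5 * log_var) * eps        # sample z while keeping it differentiable

x_hat = 1.0 / (1.0 + np.exp(-(z @ W_dec)))  # decoder reconstruction

recon = np.sum((x - x_hat) ** 2)
kl = -0.5 * np.sum(1 + log_var - mu ** 2 - np.exp(log_var))
loss = recon + kl                           # what a VAE actually minimizes
print(loss)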

Generally: learn Python.
Use TensorFlow (my current favorite) or Theano, mostly.
Either have a GPU machine or use AWS EC2.
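
And for a taste of what the framework buys you, here's a minimal sketch in the TensorFlow 1.x graph/session style (placeholders, a Session, arbitrary numbers): you only write the forward pass and the loss, and it derives the gradients itself.

import numpy as np
import tensorflow as tf

# toy data for the line y = 3x + 1
x_data = np.random.rand(100).astype(np.float32)
y_data = 3.0 * x_data + 1.0

x = tf.placeholder(tf.float32, shape=[None])
y = tf.placeholder(tf.float32, shape=[None])
w = tf.Variable(0.0)
b = tf.Variable(0.0)

loss = tf.reduce_mean(tf.square(w * x + b - y))          # the loss to minimize
train = tf.train.GradientDescentOptimizer(0.5).minimize(loss)  # gradients handled for you

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    for _ in range(200):
        sess.run(train, feed_dict={x: x_data, y: y_data})
    print(sess.run([w, b]))   # should end up near [3.0, 1.0]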