3 Matching Annotations
  1. Apr 2022
    1. If Xn is the activation vector for layer n, and Wn is the weights connecting layer n to layer n+1, for some neuron i in layer n, some neuron j in layer n+1, and some class t (using threshold θn+1)

      Note: For every layer/class pair you compute only one Q-value. You don't compute a Q-value for every node; you compute one per layer.


    1. There are some theoretical considerations pointing to the possibility that some components of the DLNN may play a more crucial role than the others in the representation of emergent symbolic knowledge in networks, in particular in biological neural networks

      Applying this process to spiking neural networks might be interesting.


    1. xcor

      This is wrong. To check the x coordinate of a patch you have to use "pxcor" instead of "xcor", since "xcor" only works for turtles.
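
      A minimal NetLogo sketch of the distinction (the model context here is hypothetical, purely for illustration):

      ```netlogo
      ;; patches expose pxcor / pycor; turtles expose xcor / ycor
      ask patches with [pxcor > 0] [ set pcolor green ]  ;; patch coordinate: pxcor
      ask turtles with [xcor > 0] [ set color red ]      ;; turtle coordinate: xcor
      ```

      Asking patches about "xcor" would raise a runtime error, which is why the quoted code needs "pxcor".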