5 This differs only slightly from the convention in the neural network literature, where a weight w_ij usually denotes a connection from neuron j to neuron i in some layer. Failing to specify which layer is a frequent source of confusion, especially in textbooks that try to explain the theory of backpropagation, because the author must then put into words what a well-chosen notation would have made obvious.
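As an illustration, one common layer-explicit convention (the superscript (l) here is an assumption for this sketch, not notation from the text) attaches the layer index to every weight, so that w^(l)_ij unambiguously names the connection from neuron j in layer l-1 to neuron i in layer l:

```latex
% Forward pass with layer-explicit weight notation:
% w^{(l)}_{ij} connects neuron j in layer l-1 to neuron i in layer l,
% so the activation of neuron i in layer l is
a^{(l)}_i = \sigma\!\Bigl( \sum_j w^{(l)}_{ij}\, a^{(l-1)}_j + b^{(l)}_i \Bigr)
```

With the layer made explicit, statements such as "the error at layer l is propagated back through the weights of layer l" need no further verbal disambiguation.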