My question is: what happens if I use different activation functions within the same layer of a neural network, and continue that pattern across the other hidden layers?

Suppose that, in one hidden layer, I have 3 ReLU units first, then 3 tanh units, followed by some other activation function. For the other hidden layers I scale the number of nodes by the same factor (increasing or decreasing) while keeping the arrangement and order of the activation functions unchanged.
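To make the setup concrete, here is a minimal NumPy sketch of one such mixed-activation layer (the layer sizes and the 3/3 split are illustrative, not prescribed by any framework):

```python
import numpy as np

def mixed_activation_layer(x, W, b):
    """One hidden layer whose units use different activations:
    the first 3 units use ReLU, the next 3 use tanh."""
    z = x @ W + b                      # pre-activations, shape (6,)
    out = np.empty_like(z)
    out[:3] = np.maximum(z[:3], 0.0)   # ReLU on the first 3 units
    out[3:] = np.tanh(z[3:])           # tanh on the remaining 3 units
    return out

rng = np.random.default_rng(0)
x = rng.normal(size=4)                 # example input
W = rng.normal(size=(4, 6))            # weights for a 4 -> 6 layer
b = np.zeros(6)
h = mixed_activation_layer(x, W, b)
```

Stacking several such layers, each with the same activation ordering but a scaled unit count, would reproduce the architecture described above.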