It is well known that the family of functions representable by fully-connected feedforward neural networks with ReLU activation functions is precisely the class of piecewise linear functions with finitely many pieces. It is less well known that, for every fixed architecture of ReLU neural network, the parameter space admits positive-dimensional spaces of symmetries, and hence the local functional dimension near a given parameter is lower than the parametric dimension. In this work, we carefully define the notion of functional dimension, show that it is inhomogeneous across the parameter space of ReLU neural network functions, and continue an investigation, initiated in [14] and [5], into when the functional dimension reaches its theoretical maximum. We also study quotient spaces and fibers of realization maps from parameter space to function space, and provide examples of truncated fibers, fibers with non-constant functional dimension, and fibers on which the symmetry group acts non-transitively.
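To make the symmetry claim concrete, here is a minimal numerical sketch under assumptions of our own: the toy architecture (1, 2, 1), the helper names `unpack` and `realize`, and the finite-difference Jacobian are all illustrative choices, not taken from this work. It checks that rescaling a hidden neuron's incoming weights and bias by c > 0 while dividing its outgoing weight by c leaves the realized function unchanged, and it estimates the local functional dimension as the rank of the Jacobian of the realization map evaluated on a sample of inputs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy architecture (1, 2, 1): parametric dimension 2 + 2 + 2 + 1 = 7.
def unpack(theta):
    W1 = theta[0:2].reshape(2, 1)
    b1 = theta[2:4]
    W2 = theta[4:6].reshape(1, 2)
    b2 = theta[6:7]
    return W1, b1, W2, b2

def realize(theta, xs):
    """Evaluate the ReLU network at a batch of scalar inputs xs, shape (m,)."""
    W1, b1, W2, b2 = unpack(theta)
    hidden = np.maximum(W1 @ xs[None, :] + b1[:, None], 0.0)  # shape (2, m)
    return (W2 @ hidden + b2[:, None]).ravel()                # shape (m,)

theta = rng.normal(size=7)
xs = rng.normal(size=20)

# Positive scaling symmetry: scale neuron 0's incoming weight and bias by
# c > 0 and its outgoing weight by 1/c; the positive homogeneity of ReLU,
# relu(c z) = c relu(z) for c > 0, keeps the realized function unchanged.
c = 2.5
theta_sym = theta.copy()
theta_sym[0] *= c      # row 0 of W1
theta_sym[2] *= c      # b1[0]
theta_sym[4] /= c      # column 0 of W2
assert np.allclose(realize(theta, xs), realize(theta_sym, xs))

# Estimate the local functional dimension as the rank of the
# finite-difference Jacobian of the realization map with respect to theta.
eps = 1e-6
J = np.stack([
    (realize(theta + eps * e, xs) - realize(theta - eps * e, xs)) / (2 * eps)
    for e in np.eye(7)
], axis=1)  # shape (m, 7)
print(np.linalg.matrix_rank(J, tol=1e-8))
```

For this architecture each of the two hidden neurons contributes one scaling symmetry, so at generic parameters one expects rank 7 - 2 = 5, strictly below the parametric dimension 7; sample points very close to an activation boundary can perturb the finite-difference estimate, which is why generic random inputs are used.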