We use a particular machine learning approach, called genetic algorithms
(GA), to place constraints on deviations from general relativity (GR)
via a possible evolution of Newton’s constant $\mu\equiv
G_\mathrm{eff}/G_\mathrm{N}$ and of the dark energy anisotropic stress $\eta$,
both of which are equal to unity in GR.
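For reference, a common convention in this literature (an assumption on our part, as the definitions are not spelled out here) introduces these functions through the perturbed metric in the Newtonian gauge and the modified Poisson equation,
\[
\mathrm{d}s^2 = -(1+2\Psi)\,\mathrm{d}t^2 + a^2(t)\,(1-2\Phi)\,\mathrm{d}\vec{x}^{\,2},
\qquad
k^2\Psi = -4\pi G_\mathrm{eff}\, a^2 \rho\, \delta,
\qquad
\eta \equiv \frac{\Phi}{\Psi},
\]
so that GR is recovered when $\mu=\eta=1$.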
Specifically, we use a plethora of background and linear-order perturbation
data, such as type Ia supernovae, baryon acoustic oscillations, cosmic
chronometers, redshift-space distortions, and $E_g$ data.
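To make the GA reconstruction concrete, the sketch below is a minimal illustration, not the authors' pipeline: it evolves a population of coefficient vectors for a fixed polynomial ansatz $\mu(z)=1+c_1 z+c_2 z^2$ through selection, crossover, and mutation against a $\chi^2$ fitness. The mock data, the ansatz, and all GA settings are placeholder assumptions; GAs in this context typically evolve analytic expressions via a grammar rather than fixed coefficients.
\begin{verbatim}
import numpy as np

rng = np.random.default_rng(0)

# Placeholder data: in the paper these would come from the SnIa, BAO,
# cosmic-chronometer, RSD and E_g compilations.
z = np.linspace(0.05, 2.0, 20)
mu_obs = 1.0 + rng.normal(0.0, 0.05, z.size)
sigma = np.full_like(z, 0.05)

def model(c, z):
    # Candidate reconstruction: mu(z) = 1 + c1*z + c2*z^2 (assumed ansatz).
    return 1.0 + c[0] * z + c[1] * z**2

def chi2(c):
    # Fitness: chi^2 of the candidate against the data; lower is better.
    return np.sum(((model(c, z) - mu_obs) / sigma) ** 2)

pop = rng.normal(0.0, 0.5, size=(100, 2))   # initial random population

for generation in range(200):
    order = np.argsort([chi2(ind) for ind in pop])
    parents = pop[order[:20]]               # selection: keep fittest 20%
    children = []
    for _ in range(len(pop) - len(parents)):
        a, b = parents[rng.integers(0, len(parents), size=2)]
        child = np.where(rng.random(2) < 0.5, a, b)  # uniform crossover
        child = child + rng.normal(0.0, 0.05, 2)     # Gaussian mutation
        children.append(child)
    pop = np.vstack([parents, children])

best = min(pop, key=chi2)
print("best-fit coefficients:", best, "chi^2:", chi2(best))
\end{verbatim}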
We find that, although the GA is affected by the lower quality of the
currently available data, especially the $E_g$ data, the reconstruction of
Newton’s constant is consistent with a constant value within the errors. On
the other hand, the anisotropic stress deviates strongly from unity, owing to
the sparsity and the systematics of the $E_g$ data. Finally, we also create
synthetic data based on a next-generation survey and forecast the sensitivity
with which deviations from GR could be detected. In particular, we use
two fiducial models: one based on the cosmological constant $\Lambda$CDM model
and another on a model with an evolving Newton’s constant, dubbed $\mu$CDM. We
find that the GA reconstructions of $\mu(z)$ and $\eta(z)$ can be constrained
to within a few percent of the fiducial models and that, in the case of the
$\mu$CDM mocks, they can also yield a detection of the deviation at the level
of several $\sigma$, thus demonstrating the utility of the GA reconstruction
approach.
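As a closing illustration of the forecasting step, a minimal sketch of how such mocks can be generated is given below; the $\mu$CDM functional form, the redshift coverage, and the error budget are illustrative assumptions, not the survey specifications adopted in the paper.
\begin{verbatim}
import numpy as np

rng = np.random.default_rng(42)

def mu_fid(z, g_a=0.2):
    # Illustrative evolving-Newton's-constant ansatz that reduces to
    # GR (mu = 1) at z = 0; the form is an assumption of this sketch.
    a = 1.0 / (1.0 + z)
    return 1.0 + g_a * (1.0 - a) ** 2

z = np.linspace(0.1, 1.7, 15)      # assumed survey redshift bins
sigma = np.full_like(z, 0.02)      # assumed few-percent errors
mock = mu_fid(z) + rng.normal(0.0, sigma)

for zi, mi, si in zip(z, mock, sigma):
    print(f"z={zi:.2f}  mu={mi:.3f} +/- {si:.3f}")
\end{verbatim}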