The neutron star-black hole binary (NSBH) system has been considered one of
the most promising detection candidates for ground-based gravitational-wave
(GW) detectors such as LIGO and Virgo. The tidal effects of neutron stars
(NSs) are imprinted on the GW signals emitted from NSBHs, as they are on
those from binary neutron stars.
In this work, we study how accurately the NS tidal deformability
$\lambda_{\rm NS}$ can be measured in GW parameter estimation for NSBH
signals. We set the parameter
range for the NSBH sources to $[4M_{\odot}, 10M_{\odot}]$ for the black hole
mass, $[1M_{\odot}, 2M_{\odot}]$ for the NS mass, and $[-0.9, 0.9]$ for the
dimensionless black hole spin. For realistic populations of sources
distributed over different regions of this parameter space, we calculate the
measurement errors of
$\lambda_{\rm NS}$ ($\sigma_{\lambda_{\rm NS}}$) using the Fisher matrix
method. In particular, we perform a single-detector analysis using the advanced
LIGO and the Cosmic Explorer detectors and a multi-detector analysis using the
2G (advanced LIGO-Hanford, advanced LIGO-Livingston, advanced Virgo, and
KAGRA) and the 3G (Einstein Telescope and Cosmic Explorer) networks. We show
the distribution of $\sigma_{\lambda_{\rm NS}}$ for the population of sources
as a one-dimensional probability density function. Our results show that the
probability density function curves are similar in shape between advanced LIGO
and Cosmic Explorer, but Cosmic Explorer can achieve $\sim 15$ times better
accuracy overall in the measurement of $\lambda_{\rm NS}$. For the detector
networks, the probability density functions peak at
$\sigma_{\lambda_{\rm NS}} \sim 130$ and $\sim 4$ for the 2G and the 3G
networks, respectively, and the 3G network achieves $\sim 10$ times better
accuracy overall.
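The quoted $\sigma_{\lambda_{\rm NS}}$ values follow from the Fisher-matrix formalism: the Fisher matrix $\Gamma_{ij}$ is built from inner products of waveform derivatives with respect to the source parameters, and the $1\sigma$ error on parameter $i$ is $\sqrt{(\Gamma^{-1})_{ii}}$. A minimal numerical sketch of this procedure, using a toy damped-sinusoid signal in white noise rather than the NSBH waveform model and noise curves of this work (all parameter values below are illustrative assumptions):

```python
import numpy as np

def signal(t, params):
    """Toy stand-in for a GW waveform: a damped sinusoid."""
    A, f, tau = params
    return A * np.exp(-t / tau) * np.sin(2 * np.pi * f * t)

def fisher_matrix(t, params, sigma_n, eps=1e-6):
    """Gamma_ij = sum_k (dh/dtheta_i)(dh/dtheta_j) / sigma_n^2,
    with derivatives taken by central finite differences."""
    n = len(params)
    derivs = []
    for i in range(n):
        step = eps * max(abs(params[i]), 1.0)
        p_plus = np.array(params, dtype=float)
        p_minus = np.array(params, dtype=float)
        p_plus[i] += step
        p_minus[i] -= step
        derivs.append((signal(t, p_plus) - signal(t, p_minus)) / (2 * step))
    gamma = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            gamma[i, j] = np.dot(derivs[i], derivs[j]) / sigma_n**2
    return gamma

t = np.linspace(0.0, 1.0, 4096)
true_params = [1.0, 50.0, 0.3]   # amplitude, frequency [Hz], damping time [s]
sigma_n = 0.1                    # white-noise standard deviation per sample

gamma = fisher_matrix(t, true_params, sigma_n)
cov = np.linalg.inv(gamma)       # Cramer-Rao lower bound on the covariance
errors = np.sqrt(np.diag(cov))   # forecast 1-sigma error on each parameter
print(errors)
```

In the full analysis the plain sums above are replaced by noise-weighted inner products over the detector power spectral density, and for a multi-detector network the Fisher matrices of the individual detectors are summed before inversion.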