Downsampling of 1D jittered signals
I have two 1D numpy arrays, x and y, where x contains the x-axis locations of my samples y. Given that x spans minX to maxX, I would like to sample both arrays at regular intervals, for example np.linspace(minX, maxX, 1000).
How can I do this in numpy? Can I solve this problem with 1D interpolation?
np.interp does 1D linear interpolation:
import numpy as np

newx = np.linspace(minX, maxX, 1000)
newy = np.interp(newx, x, y)
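One caveat worth knowing: np.interp assumes its x-coordinates are monotonically increasing, and jittered sample locations may arrive unsorted. A minimal self-contained sketch, using synthetic data (the sine signal and sample count are just placeholders for illustration):

```python
import numpy as np

# Synthetic jittered samples: irregular x locations, sorted so that
# np.interp's requirement of increasing x-coordinates is satisfied.
rng = np.random.default_rng(0)
x = np.sort(rng.uniform(0.0, 10.0, 200))
y = np.sin(x)

# Resample onto a regular grid covering the sampled range.
minX, maxX = x.min(), x.max()
newx = np.linspace(minX, maxX, 1000)
newy = np.interp(newx, x, y)  # linear interpolation at the new locations
```

If your x array is not already sorted, sort both arrays together first, e.g. with order = np.argsort(x); x, y = x[order], y[order].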
Alternatively, scipy.interpolate.interp1d lets you interpolate with splines. For example, kind='cubic' gives you third-order (cubic) spline interpolation:
import numpy as np
import scipy.interpolate as interpolate

newx = np.linspace(minX, maxX, 1000)
newy = interpolate.interp1d(x, y, kind='cubic')(newx)
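Note that interp1d is marked as legacy in recent SciPy releases; scipy.interpolate.CubicSpline is the recommended way to get cubic spline interpolation. A small sketch with synthetic data (the cosine signal is just a placeholder):

```python
import numpy as np
from scipy.interpolate import CubicSpline

# Synthetic samples on an irregular-but-sorted grid.
x = np.linspace(0.0, 2 * np.pi, 50)
y = np.cos(x)

# Build the spline once, then evaluate it at the regular locations.
newx = np.linspace(x[0], x[-1], 1000)
newy = CubicSpline(x, y)(newx)
```

CubicSpline, like np.interp, requires x to be strictly increasing.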