numpy.polyfit — NumPy v2.0 Manual (2024)

numpy.polyfit(x, y, deg, rcond=None, full=False, w=None, cov=False)

Least squares polynomial fit.

Note

This forms part of the old polynomial API. Since version 1.4, the new polynomial API defined in numpy.polynomial is preferred. A summary of the differences can be found in the transition guide.

Fit a polynomial p(x) = p[0] * x**deg + ... + p[deg] of degree deg to points (x, y). Returns a vector of coefficients p that minimises the squared error in the order deg, deg-1, ... 0.

The Polynomial.fit class method is recommended for new code as it is more stable numerically. See the documentation of the method for more information.
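As a brief sketch (using the same sample data as the Examples below), Polynomial.fit recovers equivalent coefficients; note that Polynomial objects store coefficients lowest power first, the reverse of np.polyfit's ordering:

```python
import numpy as np
from numpy.polynomial import Polynomial

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([0.0, 0.8, 0.9, 0.1, -0.8, -1.0])

# Polynomial.fit works in a shifted/scaled domain for numerical stability;
# convert() maps the result back to the unscaled domain.
p_new = Polynomial.fit(x, y, deg=3).convert()

# np.polyfit returns coefficients highest power first.
p_old = np.polyfit(x, y, 3)
print(np.allclose(p_new.coef[::-1], p_old))  # True
```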

Parameters:
x : array_like, shape (M,)

x-coordinates of the M sample points (x[i], y[i]).

y : array_like, shape (M,) or (M, K)

y-coordinates of the sample points. Several data sets of sample points sharing the same x-coordinates can be fitted at once by passing in a 2-D array that contains one dataset per column.
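For instance, a small sketch with made-up data fitting two datasets in one call by stacking them as columns of y:

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0])
# Two datasets sharing the same x-coordinates, one per column of y.
y = np.column_stack([2.0 * x + 1.0, -1.0 * x + 4.0])

p = np.polyfit(x, y, deg=1)
print(p.shape)   # (2, 2): one column of coefficients per dataset
print(p[:, 0])   # slope and intercept of the first dataset, ~[2., 1.]
```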

deg : int

Degree of the fitting polynomial.

rcond : float, optional

Relative condition number of the fit. Singular values smaller than this relative to the largest singular value will be ignored. The default value is len(x)*eps, where eps is the relative precision of the float type, about 2e-16 in most cases.

full : bool, optional

Switch determining nature of return value. When it is False (the default) just the coefficients are returned; when True, diagnostic information from the singular value decomposition is also returned.

w : array_like, shape (M,), optional

Weights. If not None, the weight w[i] applies to the unsquared residual y[i] - y_hat[i] at x[i]. Ideally the weights are chosen so that the errors of the products w[i]*y[i] all have the same variance. When using inverse-variance weighting, use w[i] = 1/sigma(y[i]). The default value is None.
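As a sketch with hypothetical per-point uncertainties, inverse-variance weighting down-weights a noisy point so it pulls the fit less:

```python
import numpy as np

# Hypothetical data where the third point has a much larger uncertainty.
x = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
y = np.array([0.1, 0.9, 5.0, 2.8, 4.1])
sigma = np.array([0.1, 0.1, 2.0, 0.1, 0.1])

# Inverse-variance weighting: w[i] = 1/sigma[i].
w = 1.0 / sigma
weighted = np.polyfit(x, y, deg=1, w=w)
unweighted = np.polyfit(x, y, deg=1)
print(weighted, unweighted)  # the weighted fit is less affected by y[2]
```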

cov : bool or str, optional

If given and not False, return not just the estimate but also its covariance matrix. By default, the covariance is scaled by chi2/dof, where dof = M - (deg + 1); i.e., the weights are presumed to be unreliable except in a relative sense, and everything is scaled such that the reduced chi2 is unity. This scaling is omitted if cov='unscaled', as is relevant for the case that the weights are w = 1/sigma, with sigma known to be a reliable estimate of the uncertainty.
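A short sketch with synthetic noisy data: the square roots of the diagonal of the covariance matrix give one-sigma uncertainties on the fitted coefficients:

```python
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 50)
y = 3.0 * x + 2.0 + rng.normal(scale=0.5, size=x.size)

coeffs, V = np.polyfit(x, y, deg=1, cov=True)
# The diagonal of V holds the variance estimate for each coefficient.
errors = np.sqrt(np.diag(V))
print(coeffs)  # close to [3., 2.]
print(errors)
```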

Returns:
p : ndarray, shape (deg + 1,) or (deg + 1, K)

Polynomial coefficients, highest power first. If y was 2-D, the coefficients for the k-th data set are in p[:,k].

residuals, rank, singular_values, rcond

These values are only returned if full == True

  • residuals – sum of squared residuals of the least squares fit

  • rank – the effective rank of the scaled Vandermonde coefficient matrix

  • singular_values – singular values of the scaled Vandermonde coefficient matrix

  • rcond – value of rcond.

For more details, see numpy.linalg.lstsq.
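A quick sketch of unpacking the diagnostic return values (using the sample data from the Examples below); for a degree-3 fit of six distinct points the effective rank is 4, one per column of the Vandermonde matrix:

```python
import numpy as np

x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([0.0, 0.8, 0.9, 0.1, -0.8, -1.0])

coeffs, residuals, rank, sv, rcond = np.polyfit(x, y, deg=3, full=True)
print(rank)      # 4: full rank for a degree-3 fit
print(sv.shape)  # (4,): one singular value per column
```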

V : ndarray, shape (deg + 1, deg + 1) or (deg + 1, deg + 1, K)

Present only if full == False and cov == True. The covariance matrix of the polynomial coefficient estimates. The diagonal of this matrix contains the variance estimates for each coefficient. If y is a 2-D array, then the covariance matrix for the k-th data set is in V[:,:,k].

Warns:
RankWarning

The rank of the coefficient matrix in the least-squares fit is deficient. The warning is only raised if full == False.

The warnings can be turned off by

>>> import warnings
>>> warnings.simplefilter('ignore', np.exceptions.RankWarning)

See also

polyval

Compute polynomial values.

linalg.lstsq

Computes a least-squares fit.

scipy.interpolate.UnivariateSpline

Computes spline fits.

Notes

The solution minimizes the squared error

\[E = \sum_{j=0}^k |p(x_j) - y_j|^2\]

in the equations:

x[0]**n * p[0] + ... + x[0] * p[n-1] + p[n] = y[0]
x[1]**n * p[0] + ... + x[1] * p[n-1] + p[n] = y[1]
...
x[k]**n * p[0] + ... + x[k] * p[n-1] + p[n] = y[k]

The coefficient matrix of the coefficients p is a Vandermonde matrix.

polyfit issues a RankWarning when the least-squares fit is badly conditioned. This implies that the best fit is not well-defined due to numerical error. The results may be improved by lowering the polynomial degree or by replacing x by x - x.mean(). The rcond parameter can also be set to a value smaller than its default, but the resulting fit may be spurious: including contributions from the small singular values can add numerical noise to the result.
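A small sketch of the centering recipe, using hypothetical sample points on a narrow interval far from the origin; the same shift must be applied when evaluating the fitted polynomial:

```python
import numpy as np

# Hypothetical sample points on a narrow interval far from the origin.
x = np.linspace(1000.0, 1010.0, 50)
y = 0.5 * x**2 - 3.0 * x + 7.0

# Recenter before fitting, then apply the same shift at evaluation time.
shift = x.mean()
coeffs = np.polyfit(x - shift, y, 2)
y_new = np.polyval(coeffs, np.array([1002.0, 1008.0]) - shift)
print(y_new)  # ~[499003., 505015.]
```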

Note that fitting polynomial coefficients is inherently badly conditioned when the degree of the polynomial is large or the interval of sample points is badly centered. The quality of the fit should always be checked in these cases. When polynomial fits are not satisfactory, splines may be a good alternative.


Examples

>>> import numpy as np
>>> import warnings
>>> x = np.array([0.0, 1.0, 2.0, 3.0, 4.0, 5.0])
>>> y = np.array([0.0, 0.8, 0.9, 0.1, -0.8, -1.0])
>>> z = np.polyfit(x, y, 3)
>>> z
array([ 0.08703704, -0.81349206,  1.69312169, -0.03968254]) # may vary

It is convenient to use poly1d objects for dealing with polynomials:

>>> p = np.poly1d(z)
>>> p(0.5)
0.6143849206349179 # may vary
>>> p(3.5)
-0.34732142857143039 # may vary
>>> p(10)
22.579365079365115 # may vary

High-order polynomials may oscillate wildly:

>>> with warnings.catch_warnings():
...     warnings.simplefilter('ignore', np.exceptions.RankWarning)
...     p30 = np.poly1d(np.polyfit(x, y, 30))
...
>>> p30(4)
-0.80000000000000204 # may vary
>>> p30(5)
-0.99999999999999445 # may vary
>>> p30(4.5)
-0.10547061179440398 # may vary

Illustration:

>>> import matplotlib.pyplot as plt
>>> xp = np.linspace(-2, 6, 100)
>>> _ = plt.plot(x, y, '.', xp, p(xp), '-', xp, p30(xp), '--')
>>> plt.ylim(-2,2)
(-2, 2)
>>> plt.show()