Least squares method

Notes

Wikidata

Corpus

  1. The least squares method provides the overall rationale for the placement of the line of best fit among the data points being studied.[1]
  2. In contrast to a linear problem, a non-linear least squares problem has no closed-form solution and is generally solved by iteration.[1]
  3. An example of the least squares method is an analyst who wishes to test the relationship between a company’s stock returns and the returns of the index of which the stock is a component.[1]
  4. The line of best fit determined from the least squares method has an equation that tells the story of the relationship between the data points.[1]
  5. When calculating least squares regressions by hand, the first step is to find the means of the dependent and independent variables (see the ordinary least squares sketch after this list).[2]
  6. Linear least squares regression is by far the most widely used modeling method.[3]
  7. It is what most people mean when they say they have used "regression", "linear regression" or "least squares" to fit a model to their data.[3]
  8. Not only is linear least squares regression the most widely used modeling method, but it has been adapted to a broad range of situations that are outside its direct scope.[3]
  9. Linear least squares regression also gets its name from the way the estimates of the unknown parameters are computed.[3]
  10. One of the first applications of the method of least squares was to settle a controversy involving Earth’s shape.[4]
  11. Measuring the shape of the Earth using the least squares approximation: the graph is based on measurements taken about 1750 near Rome by the mathematician Ruggero Boscovich.[4]
  12. The method of least squares is now widely used for fitting lines and curves to scatterplots (discrete sets of data).[4]
  13. There are multiple methods of dealing with this task, with the most popular and widely used being the least squares estimation.[5]
  14. The least-squares method is a crucial statistical method that is practised to find a regression line or a best-fit line for the given pattern.[6]
  15. The method of least squares is widely used in evaluation and regression.[6]
  16. The method of least squares defines the solution that minimizes the sum of the squared deviations or errors in the result of each equation.[6]
  17. The least-squares method is often applied in data fitting.[6]
  18. Curve Fitting Toolbox™ software uses the method of least squares when fitting data.[7]
  19. To obtain the coefficient estimates, the least-squares method minimizes the summed square of residuals.[7]
  20. Curve Fitting Toolbox software uses the linear least-squares method to fit a linear model to data.[7]
  21. Fit the model by weighted least squares (see the weighted least squares sketch after this list).[7]
  22. Least-squares problems fall into two categories: linear or ordinary least squares and nonlinear least squares, depending on whether or not the residuals are linear in all unknowns.[8]
  23. The following discussion is mostly presented in terms of linear functions but the use of least squares is valid and practical for more general families of functions.[8]
  24. In that work, Gauss claimed to have been in possession of the method of least squares since 1795.[8]
  25. However, to Gauss's credit, he went beyond Legendre and succeeded in connecting the method of least squares with the principles of probability and with the normal distribution.[8]
  26. Variations of the problem of fitting a function to a set of data (curvilinear relationships, weighted least squares, nonlinear least squares, etc.) are analyzed by Draper and Smith (1966).[9]
  27. The least squares method allows one to estimate the population regression line for which the sum of the squared deviations is a minimum.[9]
  28. The least squares method is a good procedure to estimate the regression line for the population.[9]
  29. Bacon (1953) describes the least squares method of fitting a line for different conditions and analyzes the goodness of fitting results from different experiments.[9]
  30. For a nonlinear model, linear least squares fitting may be applied iteratively to a linearized form of the function until convergence is achieved (see the Gauss-Newton sketch after this list).[10]
  31. The Least Squares Regression Line is the line that makes the vertical distance from the data points to the regression line as small as possible.[11]
  32. It is called “least squares” because the best-fit line is the one that minimizes the sum of the squares of the errors.[11]
  33. Ordinary least squares regression (OLS) is usually just called “regression” in statistics.[11]
  34. If you are performing regression analysis, either by hand or using SPSS or Excel, you’ll actually be using the least squares method.[11]
  35. When the sum of the squares of the residuals is minimized by a linear combination of unknown parameters (∑ aᵢ·xᵢ + b), the method is called the linear least-squares method.[12]
  36. When a nonlinear function is used for fitting, it is called the nonlinear least-squares method.[12]
  37. The nonlinear least-squares method also includes cases where the unknown parameters are fitted by numerical calculation without assuming a specific nonlinear function.[12]
  38. However, most people consider the least-squares method more accurate, as it computes fixed and variable costs mathematically.[13]
  39. The least squares model aims to define the line that minimizes the sum of the squared errors.[13]
  40. We need to be careful with outliers when applying the least-squares method, as it is sensitive to extreme values pulling the line towards them.[13]
  41. The least-squares method might yield unreliable results when the data are not normally distributed.[13]
  42. Regression analysis makes use of mathematical methods such as least squares to obtain a definite relationship between the predictor variable(s) and the target variable.[14]
  43. The least-squares method is one of the most effective ways used to draw the line of best fit.[14]
  44. The least squares regression method works by making the sum of the squares of the errors as small as possible, hence the name least squares.[14]
  45. In this section, we will be running a simple demo to understand the working of Regression Analysis using the least squares regression method.[14]
  46. Very often, the least squares method (LSM) is used to estimate unknown parameter values.[15]
  47. With a view toward facilitating proper use of computer curve‐fitting programs, the method of least squares for fitting smooth curves to experimental data is discussed.[16]
  48. Among the existing methods, the least squares estimator in Tong and Wang (2005) is shown to have nice statistical properties and is also easy to implement.[17]
  49. In this paper, we propose two least squares estimators for the error variance in heteroscedastic nonparametric regression: the intercept estimator and the slope estimator.[17]
  50. The least squares estimator achieves the asymptotically optimal rate that is usually possessed by residual-based estimators only.[17]
  51. Nevertheless, most of the above methods, including the least squares method, apply only to nonparametric regression models with homoscedastic errors.[17]
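
The recipe sketched in items 5 and 19 above can be written out directly: compute the means of the two variables, then choose the slope and intercept that minimize the sum of squared residuals. Below is a minimal Python sketch of that recipe for a single predictor; the function name ols_fit and the sample data are illustrative assumptions, not taken from any cited source.

def ols_fit(x, y):
    """Return (intercept, slope) minimizing the sum of squared residuals."""
    n = len(x)
    x_mean = sum(x) / n                      # step 1: means of both variables
    y_mean = sum(y) / n
    sxy = sum((xi - x_mean) * (yi - y_mean) for xi, yi in zip(x, y))
    sxx = sum((xi - x_mean) ** 2 for xi in x)
    slope = sxy / sxx                        # least-squares slope
    intercept = y_mean - slope * x_mean      # the fitted line passes through the means
    return intercept, slope

# Illustrative data: activity level (x) versus total cost (y).
x = [10, 20, 30, 40, 50]
y = [120, 190, 265, 330, 410]
b0, b1 = ols_fit(x, y)
print(f"intercept = {b0:.2f}, slope = {b1:.2f}")

Read the way item 38 does, the intercept plays the role of the fixed cost and the slope the variable cost per unit of activity.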
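
Item 21 refers to fitting a model by weighted least squares. The sketch below solves the weighted normal equations (XᵀWX)β = XᵀWy for a straight line; it assumes NumPy is available, and the weights and data points are made up for illustration.

import numpy as np

def wls_fit(x, y, w):
    """Return [intercept, slope] minimizing the weighted sum of squared residuals."""
    X = np.column_stack([np.ones_like(x), x])    # design matrix [1, x]
    W = np.diag(w)                               # diagonal weight matrix
    # Weighted normal equations: (X^T W X) beta = X^T W y
    return np.linalg.solve(X.T @ W @ X, X.T @ W @ y)

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.0, 9.8])
w = np.array([1.0, 1.0, 1.0, 0.2, 0.2])          # down-weight the noisier points
print(wls_fit(x, y, w))                          # -> [intercept, slope]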
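
Items 2 and 30 note that a nonlinear least squares problem has no closed-form solution and is instead solved by iteration: at each step a linear least squares problem is solved for a linearized form of the model until convergence. The Gauss-Newton sketch below illustrates this idea; the exponential model, its Jacobian, and the synthetic data are assumptions made only for this illustration.

import numpy as np

def model(params, x):
    a, b = params
    return a * np.exp(b * x)

def jacobian(params, x):
    # Partial derivatives of the model with respect to a and b.
    a, b = params
    return np.column_stack([np.exp(b * x), a * x * np.exp(b * x)])

def gauss_newton(x, y, params, iters=50, tol=1e-10):
    for _ in range(iters):
        r = y - model(params, x)                       # current residuals
        J = jacobian(params, x)                        # linearize at the current parameters
        step, *_ = np.linalg.lstsq(J, r, rcond=None)   # solve the linear least squares subproblem
        params = params + step
        if np.linalg.norm(step) < tol:                 # convergence achieved
            break
    return params

# Synthetic data generated with a = 2.0, b = 1.5; start the iteration elsewhere.
x = np.linspace(0.0, 1.0, 20)
y = model(np.array([2.0, 1.5]), x)
print(gauss_newton(x, y, np.array([1.0, 1.0])))        # converges to approximately [2.0, 1.5]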

Sources

Metadata

Wikidata

Spacy pattern list

  • [{'LOWER': 'least'}, {'LOWER': 'squares'}, {'LEMMA': 'method'}]
  • [{'LOWER': 'least'}, {'OP': '*'}, {'LOWER': 'squares'}, {'LEMMA': 'analysis'}]
  • [{'LOWER': 'least'}, {'OP': '*'}, {'LOWER': 'squares'}, {'LEMMA': 'method'}]
  • [{'LOWER': 'least'}, {'LOWER': 'squares'}, {'LEMMA': 'analysis'}]
  • [{'LOWER': 'least'}, {'LEMMA': 'square'}]