Least squares polynomial fitting, starting from the discovery of Ceres

  The history of science is star-studded with gems. Most of those luminaries were certified geniuses, but now and then a nobody makes an astonishing conjecture and joins the ranks of the superstars. Mendeleev, for example, laid out a periodic table full of blanks, and chemists the world over rushed to fill them in. Another was a German secondary-school teacher, Johann Daniel Titius, who in 1766 wrote down the following series:

  (0+4)/10 = 0.4

  (3+4)/10 = 0.7

  (6+4)/10 = 1.0

  (12+4)/10 = 1.6

  (24+4)/10 = 2.8

  (48+4)/10 = 5.2

  (96+4)/10 = 10.0

  (192+4)/10 = 19.6

  (384+4)/10 = 38.8

  ...


  At that time, six planets were known in the solar system: Mercury, Venus, Earth, Mars, Jupiter, and Saturn. Taking the Earth-Sun distance (about 150 million km) as one astronomical unit, the distances of the six planets from the Sun fall remarkably close to the terms of Titius's series. Fascinating! The series later became known as the Titius-Bode law.
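The series above takes only a couple of lines to generate. A minimal sketch (my addition; the planetary distances used for comparison are commonly cited round figures, not values from the original text):

```python
# Generate the Titius-Bode sequence: (t + 4) / 10 for t = 0, 3, 6, 12, ...
# (each t after the first two is double the previous one).
terms = [0, 3, 6, 12, 24, 48, 96, 192, 384]
bode = [(t + 4) / 10 for t in terms]
print(bode)  # [0.4, 0.7, 1.0, 1.6, 2.8, 5.2, 10.0, 19.6, 38.8]

# Approximate observed semi-major axes in AU (rounded textbook values,
# for comparison only -- not measurements from this article).
observed = {"Mercury": 0.39, "Venus": 0.72, "Earth": 1.00,
            "Mars": 1.52, "Jupiter": 5.20, "Saturn": 9.54}
for (name, a), b in zip(observed.items(), [0.4, 0.7, 1.0, 1.6, 5.2, 10.0][:4] + [5.2, 10.0]):
    print(f"{name:8s} observed {a:5.2f} AU   Titius-Bode {b:5.2f} AU")
```

Note that the 2.8 term is skipped when pairing with the six known planets: Jupiter matches 5.2 and Saturn 10.0, leaving 2.8 with no known body. That gap is exactly where the story goes next.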

  In 1781, the German-born British astronomer Herschel discovered Uranus close to the 19.6 position (the eighth term of the series), and from then on people took the law seriously. By the same logic, the 2.8 position should correspond to a fifth planet, yet none had been found there. So many astronomers and amateur astronomers set out, with great enthusiasm, to hunt for the missing planet.

  On New Year's night of 1801, the Italian priest Giuseppe Piazzi was still attentively watching the stars. Through his telescope he suddenly spotted a very faint star, right at the 2.8 position given by the Titius-Bode law. Over several days of observation the object kept changing position. Just as Piazzi planned to observe it further, he fell ill; by the time he recovered and went looking for it again, it was gone. Piazzi did not let the matter drop: he believed this might be the planet everyone had been searching for.

  Astronomers held differing views on Piazzi's discovery. Some thought Piazzi was right, some thought the object could be a comet, and only a minority believed it was the predicted planet. The debate stirred the whole astronomical community.

  At this point a heavyweight appeared: the mathematical genius Gauss. From Piazzi's observations, Gauss, with his formidable mathematical ability, worked out the orbit of the elusive object in only about an hour, and predicted when it would appear in which patch of sky. On the night of 31 December 1801, the German amateur astronomer Olbers pointed his telescope at that patch of sky at the time Gauss had predicted. Sure enough, the mysterious object magically reappeared!

  From Johann Titius, to Father Giuseppe Piazzi, to the Prince of Mathematics Gauss and the amateur astronomer Olbers, a chain of remarkable people jointly discovered this object, which was named Ceres. Ceres is the smallest dwarf planet in the solar system and the only one located in the asteroid belt. Ceres once served as a yardstick: anything larger than it could be counted as a planet-class body, anything smaller as an asteroid. Pluto, for example, is larger than Ceres and was long called the ninth planet of the solar system, until 2006, when it was reclassified as a dwarf planet and withdrew from the ranks of the nine planets.

  So how exactly did Gauss calculate Ceres's orbit and make his prediction? He fitted the trajectory by least squares. Gauss published the method of least squares in 1809 in his book Theoria Motus Corporum Coelestium (Theory of the Motion of Celestial Bodies).

  Let's pretend to be Gauss for a while, do a bit of cosplay, and experience how the greats felt.

  Suppose the figure shows Piazzi's observation records of Ceres from time 0 to time 10: each red dot is one observation, with the abscissa representing the time and the ordinate the corresponding position of Ceres. If some curve passes (very nearly) through the 11 observed points, then once we find the equation of that curve:

  y = f(x)

  we can predict Ceres's position at time 11, time 12, and so on. That is how Gauss reasoned. But how to find the equation of the curve? The question did not stump Gauss: he used a polynomial g(x) of degree k to approximate f(x):

  f(x) \approx g(x) = a_0 + a_1x + a_2x^2 + a_3x^3 + ... + a_kx^k
  By selecting appropriate coefficients a_0, a_1, a_2, a_3, ..., a_k, so that the error:

  loss = \sum_{i=0}^{10}(g(i)-f(i))^2

  is minimized, we may take g(x) as the curve equation we are looking for. Gauss's method of least squares is exactly this: given the observed data and a polynomial degree k, find the set of coefficients a_0, a_1, a_2, a_3, ..., a_k that minimizes the error. With those coefficients we obtain g(x); evaluating g(11) and g(12) then predicts the position of Ceres at times 11 and 12.
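The minimization has a closed-form solution. Setting each partial derivative of the loss to zero gives a linear system known as the normal equations; this derivation is my addition and is not spelled out in the original:

```latex
\frac{\partial\, loss}{\partial a_j}
  = 2\sum_{i=0}^{10}\bigl(g(i)-f(i)\bigr)\, i^j = 0,
  \qquad j = 0, 1, \dots, k
```

In matrix form, with the Vandermonde matrix A_{ij} = i^j and the observation vector y_i = f(i), this reads A^T A a = A^T y, a (k+1)-by-(k+1) linear system that any linear solver can handle.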

  We are not Gauss and haven't mastered the method of least squares ourselves, but fortunately numpy provides a powerful tool that lets us keep up the impersonation.

  >>> import numpy as np

  >>> _x = np.linspace(0, 10, 11)

  >>> _y = np.array([-0.3, -0.5, -0.2, -0.3, 0, 0.4, 0.2, -0.3, 0.2, 0.5, 0.4])

  >>> np.polyfit(_x, _y, 3)

  array([ 0.00066045, -0.01072261, 0.12684538, -0.43146853])


  Here, _x is Piazzi's sequence of observation times and _y the corresponding sequence of observed positions of Ceres; we chose a polynomial degree of 3. Calling np.polyfit(_x, _y, 3) finds, by least squares, the four coefficients of the cubic polynomial that minimizes the error against the observed data. Written out, the cubic polynomial is:
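Under the hood, np.polyfit solves an ordinary linear least squares problem on a Vandermonde matrix. A small sketch showing the equivalence (an illustration of the technique, not numpy's actual source code):

```python
import numpy as np

_x = np.linspace(0, 10, 11)
_y = np.array([-0.3, -0.5, -0.2, -0.3, 0, 0.4, 0.2, -0.3, 0.2, 0.5, 0.4])

# Vandermonde matrix: columns are x^0, x^1, x^2, x^3
A = np.vander(_x, 4, increasing=True)

# Solve min ||A @ coeffs - _y||^2 directly
coeffs, *_ = np.linalg.lstsq(A, _y, rcond=None)

# np.polyfit returns the same coefficients, highest power first
print(np.allclose(coeffs[::-1], np.polyfit(_x, _y, 3)))  # True
```

The coefficients agree to machine precision, which is reassuring: polyfit is just a convenient wrapper around this linear solve.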

  g(x) = -0.43146853 + 0.12684538x - 0.01072261x^2 + 0.00066045x^3

  Use this function to check against the observed data:

  >>> g = np.poly1d(np.polyfit(_x, _y, 3))

  >>> g(_x)

  array([-0.43146853, -0.31468531, -0.21538462, -0.12960373, -0.05337995,

  0.01724942, 0.08624709, 0.15757576, 0.23519814, 0.32307692,

  0.42517483])

  >>> loss = np.sum(np.square(g(_x)-_y))

  >>> loss

  0.4857342657342658

  >>> import matplotlib.pyplot as plt

  >>> plt.plot(_x, _y, c='r', ls='', marker='o')

  >>> plt.plot(_x, g(_x), c='g', ls=':')

  >>> plt.show()


  Replacing f(x) with the cubic polynomial g(x) gives a minimum error of 0.4857. Plotting the observed data together with the approximation g(x) produces the following figure.

  Obviously, this fit is not good enough to pin down Ceres. No worries: we can also try higher-degree polynomials and see how they do. The following code plots the fits of degrees 3 through 9 together with their errors.

  import numpy as np
  import matplotlib.pyplot as plt
  plt.rcParams['font.sans-serif'] = ['FangSong']
  plt.rcParams['axes.unicode_minus'] = False
  _x = np.linspace(0, 10, 11)
  _y = np.array([-0.3, -0.5, -0.2, -0.3, 0, 0.4, 0.2, -0.3, 0.2, 0.5, 0.4])
  g3 = np.poly1d(np.polyfit(_x, _y, 3))
  g4 = np.poly1d(np.polyfit(_x, _y, 4))
  g5 = np.poly1d(np.polyfit(_x, _y, 5))
  g6 = np.poly1d(np.polyfit(_x, _y, 6))
  g7 = np.poly1d(np.polyfit(_x, _y, 7))
  g8 = np.poly1d(np.polyfit(_x, _y, 8))
  g9 = np.poly1d(np.polyfit(_x, _y, 9))
  loss3 = np.sum(np.square(g3(_x)-_y))
  loss4 = np.sum(np.square(g4(_x)-_y))
  loss5 = np.sum(np.square(g5(_x)-_y))
  loss6 = np.sum(np.square(g6(_x)-_y))
  loss7 = np.sum(np.square(g7(_x)-_y))
  loss8 = np.sum(np.square(g8(_x)-_y))
  loss9 = np.sum(np.square(g9(_x)-_y))
  plt.plot(_x, _y, c='r', ls='', marker='o')
  plt.plot(_x, g3(_x), label='degree-3 polynomial, error %0.4f' % loss3)
  plt.plot(_x, g4(_x), label='degree-4 polynomial, error %0.4f' % loss4)
  plt.plot(_x, g5(_x), label='degree-5 polynomial, error %0.4f' % loss5)
  plt.plot(_x, g6(_x), label='degree-6 polynomial, error %0.4f' % loss6)
  plt.plot(_x, g7(_x), label='degree-7 polynomial, error %0.4f' % loss7)
  plt.plot(_x, g8(_x), label='degree-8 polynomial, error %0.4f' % loss8)
  plt.plot(_x, g9(_x), label='degree-9 polynomial, error %0.4f' % loss9)
  plt.legend()
  plt.show()


  As you can see, the error of the degree-9 polynomial fit drops to as little as 0.0010, and its curve passes almost exactly through every observation point. That is satisfactory enough. The figure below keeps only the degree-9 fit, for a clearer look.
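Before celebrating, a caveat worth adding (my addition, not the original's): a tiny training error does not make a polynomial trustworthy for prediction. High-degree polynomials often swing wildly outside the observed interval, which matters precisely because our goal was to extrapolate to times 11 and 12:

```python
import numpy as np

_x = np.linspace(0, 10, 11)
_y = np.array([-0.3, -0.5, -0.2, -0.3, 0, 0.4, 0.2, -0.3, 0.2, 0.5, 0.4])

g3 = np.poly1d(np.polyfit(_x, _y, 3))
g9 = np.poly1d(np.polyfit(_x, _y, 9))

# On the observed points, the degree-9 fit has (much) smaller error...
print(np.sum((g3(_x) - _y)**2), np.sum((g9(_x) - _y)**2))

# ...but compare the two "predictions" outside the data range:
# the degree-9 polynomial typically shoots off far from the data.
print(g3(12), g9(12))
```

This is the classic overfitting trade-off: low error on the fitted points, poor behavior beyond them. Gauss, of course, fitted orbital elements constrained by physics rather than a free polynomial, which is why his extrapolation worked.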

  Well, the cosplay is over. Real celestial orbits are nowhere near this contrived, and a body's position cannot be expressed by a single value anyway; the example above is purely an exercise in polynomial fitting. Sometimes the data to be fitted looks like the figure below. No matter: the fitting approach above still works.

  The fitting code is as follows:

  import numpy as np
  import matplotlib.pyplot as plt
  plt.rcParams['font.sans-serif'] = ['FangSong']
  plt.rcParams['axes.unicode_minus'] = False
  xs = np.linspace(-1, 1, 11)
  ys = np.array([-0.3, -0.5, -0.2, -0.3, 0, 0.4, 0.2, -0.3, 0.2, 0.5, 0.4])
  xm = np.linspace(-1, 1, 201)
  ym = ((xm**2-1)**3 + 0.5)*np.sin(2*xm) + np.random.random(201)/10 - 0.1
  fs4 = np.poly1d(np.polyfit(xs, ys, 4))
  fs5 = np.poly1d(np.polyfit(xs, ys, 5))
  fs6 = np.poly1d(np.polyfit(xs, ys, 6))
  fs7 = np.poly1d(np.polyfit(xs, ys, 7))
  fs8 = np.poly1d(np.polyfit(xs, ys, 8))
  fs9 = np.poly1d(np.polyfit(xs, ys, 9))
  fs10 = np.poly1d(np.polyfit(xs, ys, 10))
  fm2 = np.poly1d(np.polyfit(xm, ym, 2))
  fm3 = np.poly1d(np.polyfit(xm, ym, 3))
  fm4 = np.poly1d(np.polyfit(xm, ym, 4))
  fm5 = np.poly1d(np.polyfit(xm, ym, 5))
  fm6 = np.poly1d(np.polyfit(xm, ym, 6))
  fm7 = np.poly1d(np.polyfit(xm, ym, 7))
  plt.subplot(211)
  plt.plot(xs, ys, c='r', ls='', marker='o')
  plt.plot(xs, fs4(xs), label='degree-4 polynomial')
  plt.plot(xs, fs5(xs), label='degree-5 polynomial')
  plt.plot(xs, fs6(xs), label='degree-6 polynomial')
  plt.plot(xs, fs7(xs), label='degree-7 polynomial')
  plt.plot(xs, fs8(xs), label='degree-8 polynomial')
  plt.plot(xs, fs9(xs), label='degree-9 polynomial')
  plt.legend()


  plt.subplot(212)
  plt.plot(xm, ym, c='g', ls='', marker='.')
  plt.plot(xm, fm2(xm), label='degree-2 polynomial')
  plt.plot(xm, fm3(xm), label='degree-3 polynomial')
  plt.plot(xm, fm4(xm), label='degree-4 polynomial')
  plt.plot(xm, fm5(xm), label='degree-5 polynomial')
  plt.plot(xm, fm6(xm), label='degree-6 polynomial')
  plt.plot(xm, fm7(xm), label='degree-7 polynomial')
  plt.legend()
  plt.show()


Origin www.cnblogs.com/wode1/p/Daniel.html