To the human eye, they form lines with a constant slope:

This is how I generated the image above:

```python
import numpy as np

slope = 1.2  # all lines have the same slope
offsets = np.arange(10)  # we will have 10 lines, each with a different y-intercept

xs, ys = [], []
for offset in offsets:
    # each line will be described by a variable number of points
    size = np.random.randint(low=50, high=100)
    # each line starts somewhere between -5 and -2 and ends somewhere between 2 and 5
    x = np.random.uniform(low=np.random.uniform(-5, -2), high=np.random.uniform(2, 5), size=size)
    # add some random offset and some random noise
    y = slope * x + offset + np.random.normal(loc=0, scale=0.1, size=1) + np.random.normal(loc=0, scale=0.01, size=size)
    xs.append(x)
    ys.append(y)

# bring all x and y points together into single arrays
xs = np.concatenate(xs)
ys = np.concatenate(ys)
```

In my real-world scenario I won't know which point belongs to which line, so I cannot simply separate the points into groups and apply least squares fitting to each group.

How could I, using machine learning or otherwise, build a function which takes `xs` and `ys` as input and returns a slope estimate of the lines on an image like the one above?

Why simple least squares fitting doesn't seem to work

Let's generate new data where the failure of least squares fitting is more obvious: a slope of 2.4 and y-intercepts between 0 and a few hundred. Note that I cannot fit to just one line, as I don't know which points belong to which line.

Least squares fitting of a line using `np.polyfit()`:

```python
a, b = np.polyfit(xs, ys, deg=1)
```

Plotting the results:

```python
import matplotlib.pyplot as plt

plt.scatter(xs, ys, s=2, color="black")
plt.plot(xs, a * xs + b, color="red")
plt.show()
```

The slope obtained by least squares fitting (i.e. the slope of the red line) is very different from the slope of the lines formed by the black dots. (Note that the scale is different on the x and y axes.) Printing both `a` (our slope estimate) and the real slope:

```python
print(a, slope)
```
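One possible answer to the question posed above: since all lines share a slope, the residuals `ys - a*xs` collapse onto the (unknown) y-intercepts exactly when `a` is the true slope, so we can grid-search for the slope whose residuals cluster most tightly. This is a minimal sketch, not the only approach (a Hough transform, RANSAC, or a mixture model would also work); the cost function `residual_clustering_cost` and the candidate grid are my own illustrative choices, not part of the original post.

```python
import numpy as np

rng = np.random.default_rng(42)

# Regenerate toy data as described in the question: 10 parallel lines with
# shared slope 1.2, y-intercepts 0..9, each sampled at a variable number of points.
true_slope = 1.2
xs, ys = [], []
for offset in np.arange(10):
    size = rng.integers(50, 100)
    x = rng.uniform(rng.uniform(-5, -2), rng.uniform(2, 5), size=size)
    y = true_slope * x + offset + rng.normal(0, 0.1) + rng.normal(0, 0.01, size=size)
    xs.append(x)
    ys.append(y)
xs, ys = np.concatenate(xs), np.concatenate(ys)

def residual_clustering_cost(a, xs, ys):
    """Median gap between sorted residuals ys - a*xs.

    If `a` matches the shared slope, the residuals pile up near the
    intercepts and most gaps are tiny; a wrong slope smears the residuals
    out, so the typical gap grows."""
    r = np.sort(ys - a * xs)
    return np.median(np.diff(r))

# Grid-search candidate slopes; keep the one whose residuals cluster tightest.
candidates = np.linspace(-1.0, 3.0, 401)
costs = [residual_clustering_cost(a, xs, ys) for a in candidates]
a_hat = candidates[int(np.argmin(costs))]
print(a_hat)  # should land close to the true slope 1.2
```

The median gap is used rather than the mean because the mean of the sorted gaps only measures the overall range of the residuals, while the median is dominated by the many small within-cluster gaps and so actually rewards clustering.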