Use gradients instead of distances for error #6

@robz

Description

Currently, optimization uses a sum of minimum distances between points to calculate error. Instead, it could compare the change in angles between points along the paths. Optimization could then create a path that matches the curves of the input path without having to match the x-y coordinates of the points that make up the input path.
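For reference, a minimal sketch of the current approach as described above (names are illustrative, not taken from the codebase): for each point on one path, find the nearest point on the other, and sum those minimum distances.

```python
import math

def current_error(path_a, path_b):
    """Sum of minimum Euclidean distances: for each point on path_a,
    the distance to the nearest point on path_b (illustrative sketch)."""
    return sum(
        min(math.hypot(ax - bx, ay - by) for bx, by in path_b)
        for ax, ay in path_a
    )
```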

One way to accomplish this is to transform each path into a plot of angle deltas between each segment of the path (I'll call this path "gradient space"). For example, consider a straight line as an input path. In "gradient space" this path would be represented as a flat line, with two sharp peaks indicating the endpoints of the path. Thus, the "gradient space" transform describes the curve of the path without the location of its points.
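The transform described above could look something like this: compute the heading of each segment, then take the signed change in heading at each interior point (wrapped into (-pi, pi] so a path that crosses the ±pi boundary doesn't produce a spurious spike).

```python
import math

def gradient_space(points):
    """Transform a path (list of (x, y) points) into its "gradient space"
    plot: the angle delta between each pair of consecutive segments."""
    # Heading of each segment of the path
    headings = [
        math.atan2(y2 - y1, x2 - x1)
        for (x1, y1), (x2, y2) in zip(points, points[1:])
    ]
    # Signed change in heading at each interior point, wrapped into (-pi, pi]
    deltas = []
    for a, b in zip(headings, headings[1:]):
        d = b - a
        deltas.append(math.atan2(math.sin(d), math.cos(d)))
    return deltas
```

A straight line maps to all-zero deltas, and a right-angle turn shows up as a single delta of pi/2, regardless of where the path sits in x-y space.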

Plots of two paths could be shifted against each other and compared by subtracting each pair of angle deltas. The shift with the minimum sum of absolute differences would be used as the error between the two paths. A complete comparison would require at most N^2 subtractions, where each path has N segments. That's cheaper per operation than the current error function, which requires N^2 Euclidean distance computations (each involving a square root), where each path has N points.
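A sketch of that comparison (assuming the shorter plot slides along the longer one, and using sum of absolute differences as the per-shift error):

```python
def gradient_error(deltas_a, deltas_b):
    """Compare two gradient-space plots by sliding one over the other and
    taking the shift with the smallest sum of absolute differences."""
    # Slide the shorter plot along the longer one
    if len(deltas_a) > len(deltas_b):
        deltas_a, deltas_b = deltas_b, deltas_a
    best = float('inf')
    # At most N shifts of at most N subtractions each: O(N^2) subtractions
    for shift in range(len(deltas_b) - len(deltas_a) + 1):
        err = sum(abs(a - b) for a, b in zip(deltas_a, deltas_b[shift:]))
        best = min(best, err)
    return best
```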

One problem with "gradient space" is that it doesn't include scale: a tiny straight line could map to the same angle plot as a very long straight line. I'm not sure of the best way to deal with that, but one solution could be to scale the angle plot by the distance between each point in the original path.
