I have a 128x128 image stored as a 2D numpy.ndarray (it's effectively a heatmap, so each entry is just a scalar value). I have identified:
- one point on my image, P = (x0, y0)
- a direction v = [v0, v1]
- a line L, which passes through P and is perpendicular to v
- a scale factor s (suppose for concreteness that s is a percentage)
I want to stretch my image away from the line L, i.e. along the direction of v, by s. What I mean by "stretch away from L" is that points on L should remain invariant under the transformation. The following diagram depicts the case where s is positive, so all points move away from L:
If s were negative, then I'd want to move all points closer to L.
If L passed through the origin, then this would just be a simple linear transformation, and I could use a normalized v times (1 + s) and some unit vector along L as my basis vectors. However, because L does not necessarily pass through the origin, I'm under the impression that this is some type of affine transformation.
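Writing that impression out: the map I'm after seems to be x' = x + s·((x − P)·n)·n with n = v/|v|, which is affine with linear part A = I + s·nnᵀ and translation b = −s·(P·n)·n (points on L satisfy (x − P)·n = 0, so they're fixed). A minimal numpy sketch of that matrix — `stretch_matrix` is just a name I made up:

```python
import numpy as np

def stretch_matrix(P, v, s):
    """2x3 affine matrix for x' = x + s * ((x - P) . n) * n, with n = v/|v|.

    Points on the line L through P perpendicular to v are fixed; every
    other point moves along v by fraction s of its distance to L.
    """
    n = np.asarray(v, dtype=float)
    n /= np.linalg.norm(n)
    P = np.asarray(P, dtype=float)
    A = np.eye(2) + s * np.outer(n, n)   # linear part: I + s n n^T
    b = -s * n.dot(P) * n                # translation that keeps L fixed
    return np.hstack([A, b[:, None]])

# horizontal stretch away from the vertical line x = 64, s = 10%
M = stretch_matrix(P=(64, 64), v=(1, 0), s=0.10)
```

With these inputs, (64, 64) should map to itself, while (74, 64) — ten pixels from L — should land on (75, 64).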
Some preferred qualities for a solution:
- I would prefer to just modify the contents of the image and keep it the same size, rather than resizing the image in any way
- the "field of view" of the image should stay the same, i.e. L should stay in the same place in the image before and after the transformation
The first thing I tried was some type of image resizing, using cv2.resize. However, that only handles axis-aligned scaling about the origin (so L would have to pass through the origin), and it also forces me to change the size of the image itself.
The next thing I tried was using cv2.getAffineTransform along with cv2.warpAffine. Since cv2.getAffineTransform takes two sets of points (one before, one after), I decided to take two points along the stretch direction and calculate their midpoint (these are the 'before' points), then scale the original two points away from the midpoint to get the 'after' points. An image of the process:
My code:
```python
def get_transformation_matrix(pt1: tuple[float, float], pt2: tuple[float, float], scale_factor: float) -> np.ndarray:
    """Returns the transformation matrix to scale an image along the line
    containing `pt1` and `pt2`, centered at their midpoint, by `scale_factor`
    (a percentage increase/decrease)."""
    # the vector from pt1 to pt2, divided by 2
    # AKA the vector from pt1 to center
    # AKA the vector from center to pt2
    slope_vector = np.array([pt2[0] - pt1[0], pt2[1] - pt1[1]]) / 2
    # midpoint of the two given points
    center = np.array([(pt2[0] - pt1[0]) / 2 + pt1[0], (pt2[1] - pt1[1]) / 2 + pt1[1]])
    # new points are just center ± resized slope_vector
    new_pt1 = center - slope_vector * (1 + scale_factor / 100)
    new_pt2 = center + slope_vector * (1 + scale_factor / 100)
    initial_points = np.float32([pt1, center, pt2])
    final_points = np.float32([new_pt1, center, new_pt2])
    return cv2.getAffineTransform(initial_points, final_points)

M = wf.get_transformation_matrix([100, 100], [200, 100], 10)
print(M)
dst = cv2.warpAffine(data, M, data.shape)
```

However, the `M` returned was just `[[0. 0. 0.] [0. 0. 0.]]`, and `dst` was a completely black image. I'm not sure what went wrong, but every tutorial I found used a triangle for the initial and final points with cv2.getAffineTransform, so maybe using collinear points caused an issue?
The final thing I looked at is scipy.ndimage.affine_transform, which looks promising but requires me to know the transformation matrix and offset, and I'm not really sure how to find them.
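As far as I can tell from the docs, scipy.ndimage.affine_transform maps *output* coordinates back to *input* coordinates (input = matrix @ output + offset), and it indexes in (row, col) order, so a forward map x' = Ax + b would have to be inverted before being passed in. A sketch of my understanding — `stretch_about_line` is my own name, and I haven't verified this beyond toy cases:

```python
import numpy as np
from scipy import ndimage

def stretch_about_line(img, P, v, s):
    """Stretch img away from the line through P perpendicular to v by
    fraction s, leaving the line itself fixed.

    Coordinates are in scipy's (row, col) order.  affine_transform maps
    output coordinates to input coordinates (input = matrix @ output +
    offset), so we build the forward map x' = A x + b and pass its
    inverse: x = A^-1 x' - A^-1 b.
    """
    n = np.asarray(v, dtype=float)
    n /= np.linalg.norm(n)
    P = np.asarray(P, dtype=float)
    A = np.eye(2) + s * np.outer(n, n)   # forward linear part
    b = -s * n.dot(P) * n                # forward translation fixing the line
    A_inv = np.linalg.inv(A)
    return ndimage.affine_transform(img, A_inv, offset=-A_inv @ b, order=1)
```

Because the inverse map is what's passed in, the output image stays the same size and the line L stays where it was, which is exactly the "field of view" behaviour I want.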
Finally, I'm adding this subquestion in this post because it's related to the above: suppose that instead of stretching away from an invariant line L, I want to do a uniform dilation about the point P. I assume that this is going to be very similar to the case with the line, with the exception that the basis vector along L will no longer be a unit vector, but I figured I might as well confirm.
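To make the comparison concrete, here's how I'd write the dilation about P as an affine map: x' = P + (1 + s)(x − P), i.e. the linear part becomes (1 + s)·I (every direction scaled, not just v) while the translation is built the same way, chosen so that P stays fixed. A sketch with a made-up name:

```python
import numpy as np

def dilation_matrix(P, s):
    """2x3 affine matrix for a uniform dilation about P:
    x' = P + (1 + s) * (x - P), so only P itself is invariant."""
    P = np.asarray(P, dtype=float)
    k = 1 + s
    A = k * np.eye(2)   # linear part scales every direction equally
    b = P - k * P       # = -s * P; translation that keeps P fixed
    return np.hstack([A, b[:, None]])

M = dilation_matrix((64, 64), 0.10)
```

I believe cv2.getRotationMatrix2D(P, 0, 1 + s) produces the same matrix (a rotation by zero degrees with a scale factor about a given center), though I'd want to double-check that.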
Apologies if this question has been asked before: I looked around and didn't find much, but it's possible that I just don't know what search terms to use.

