I am running into an error when trying to run my code for this assignment-cost minimization problem in Python, and I was hoping someone could point out what I have done incorrectly here.
The question:
My code:
```python
import numpy as np
import time
from scipy.optimize import linprog

n = 10  # n: number of jobs/workers
cost = np.random.rand(n, n)  # generate an n-by-n random matrix
c = cost.reshape((n**2, 1))  # reshape the cost matrix to a column vector

# row constraints (sum of each row equals 1)
A_eq_rows = np.ones((n, n**2))
b_eq_rows = np.ones(n)

# column constraints (sum of each column equals 1)
A_eq_cols = np.zeros((n, n**2))
for i in range(n):
    A_eq_cols[i, i::n] = 1
b_eq_cols = np.ones(n)

# assemble A_eq and b_eq
A_eq = np.vstack((A_eq_rows, A_eq_cols))
b_eq = np.hstack((b_eq_rows, b_eq_cols))

start_time = time.time()  # start timing
res = linprog(c, None, None, A_eq, b_eq)
end_time = time.time()  # end timing
elapsed_LP = end_time - start_time

print("Minimum cost = ", res.fun)
print("Elapsed time = ", elapsed_LP)
```
I keep getting None as my minimum cost and am not entirely sure what I've done wrong. I have a decent fundamental understanding of how to solve this problem on paper, but limited experience doing it in Python.
I know the problem has n^2 variables and 2n-1 linearly independent constraints. Therefore a BFS of the assignment problem has 2n-1 basic variables, but for each assignment exactly n of the n^2 variables are nonzero, so every BFS is degenerate. Any help with this would be greatly appreciated.
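To make that degeneracy count concrete, here is a minimal sketch (the choice of n = 4 and the seeded random costs are my own, for illustration): the optimal vertex is a permutation matrix with exactly n nonzero entries, while a basis must carry 2n-1 variables, so at least n-1 basic variables sit at zero.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

n = 4
rng = np.random.default_rng(1)
cost = rng.random((n, n))

# Optimal vertex of the assignment polytope: a permutation matrix.
row_ind, col_ind = linear_sum_assignment(cost)
x = np.zeros((n, n))
x[row_ind, col_ind] = 1

nonzeros = np.count_nonzero(x)   # exactly n nonzero variables
basic_vars = 2 * n - 1           # rank of the constraint matrix
print(nonzeros, basic_vars)      # prints "4 7"
print(basic_vars - nonzeros)     # at least 3 basic variables are zero: degenerate
```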
NOTE: I am aware I can solve this with the function scipy.optimize.linear_sum_assignment() and have done so, but I wanted to check my answer by solving the same problem as a linear program.
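For what it's worth, here is one way such a comparison could look. This is only a sketch under my own assumptions: I build the row-constraint matrix block by block so that each row equation touches only its own n variables (note this differs from the np.ones construction above), and I seed the RNG so both solvers see the same cost matrix. Since the assignment polytope has integral vertices, the LP optimum should match the Hungarian-algorithm answer.

```python
import numpy as np
from scipy.optimize import linprog, linear_sum_assignment

n = 10
rng = np.random.default_rng(0)   # seeded so both solvers see identical costs
cost = rng.random((n, n))

c = cost.reshape(n**2)  # x[i, j] lives at flat index i*n + j

# Row constraints: row i's equation covers flat indices i*n .. i*n + n - 1.
A_eq_rows = np.zeros((n, n**2))
for i in range(n):
    A_eq_rows[i, i*n:(i+1)*n] = 1

# Column constraints: column j's equation covers every n-th flat index from j.
A_eq_cols = np.zeros((n, n**2))
for j in range(n):
    A_eq_cols[j, j::n] = 1

A_eq = np.vstack((A_eq_rows, A_eq_cols))
b_eq = np.ones(2 * n)

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=(0, None))

# Reference answer from the Hungarian-algorithm solver.
row_ind, col_ind = linear_sum_assignment(cost)
hungarian_cost = cost[row_ind, col_ind].sum()

print("LP minimum cost        =", res.fun)
print("Hungarian minimum cost =", hungarian_cost)
```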