I'm having an issue implementing an NLP (nonlinear program) with `scipy`. Given a decision vector `x` and a fixed utility vector `u`, I want to minimize a utility function of the form:
```python
import numpy as np

utility_vector = np.array([0.10, 0.08, 0.05, 0.075, 0.32, 0.21, 0.18, 0.05, 0.03, 0.12])
x0 = np.zeros((10,))
groups = [[0, 1, 2, 3], [4, 5], [6, 7, 8, 9]]
z_group = [0.25, 0.55, 0.2]

def opt_func(x, u, target):
    utility = (x * u).sum()
    return (utility - target)**2
```
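As a sanity check that the objective itself behaves as expected, I evaluated it at a uniform allocation (the uniform `x` here is just an illustrative input, not my actual starting point):

```python
import numpy as np

utility_vector = np.array([0.10, 0.08, 0.05, 0.075, 0.32, 0.21, 0.18, 0.05, 0.03, 0.12])

def opt_func(x, u, target):
    utility = (x * u).sum()
    return (utility - target)**2

# Uniform allocation: each of the 10 fields gets 0.1, so x sums to 1.0
x_uniform = np.full(10, 0.1)

# Total utility is 0.1 * utility_vector.sum() = 0.1215,
# so the objective should be (0.1215 - 0.16)**2
print(opt_func(x_uniform, utility_vector, 0.16))
```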
I also have a number of sub-total linear constraints of the form `x[selection].sum() == Z1`, and a total constraint `x.sum() == 1.0`.
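For what it's worth, I don't believe the constraints are mutually inconsistent, since the group sub-totals add up to the total target (this is just a quick feasibility check, not part of the optimization itself):

```python
# Group sub-totals should add up to the total allocation of 1.0
z_group = [0.25, 0.55, 0.2]
print(sum(z_group))  # expect ~1.0 (up to floating-point rounding)
```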
However, when I apply these constraints via `scipy.optimize.minimize`, the solution vector does not satisfy the sub-total linear constraints I specified.
```python
from scipy.optimize import minimize

cons = []

# total linear constraint
cons.append({'type': 'eq', 'fun': lambda x: 1 - x.sum()})

# sub-total linear constraints
for idx, select in enumerate(groups):
    cons.append({'type': 'eq', 'fun': lambda x: z_group[idx] - x[select].sum()})

bnds = tuple((0, None) for i in range(10))
res = minimize(fun=opt_func, x0=x0, method='trust-constr', bounds=bnds,
               constraints=tuple(cons), args=(utility_vector, 0.16), tol=1e-4)

print(res)
print(f'\nTotal allocation sum {res.x.sum()}')

# output
for idx, select in enumerate(groups):
    print(f'{select} fields difference {z_group[idx] - res.x[select].sum()}')
```
Any help or insight into how I may go about addressing this would be greatly appreciated.