Is your feature request related to a problem? Please describe.
After calling a modopt solver on `X, y`, I can no longer fit a `sklearn.linear_model.Lasso` on them, because modopt has made `X` and `y` non-writeable.
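To illustrate the underlying mechanism (plain NumPy, independent of modopt): once an array's `writeable` flag is cleared, any in-place write raises a `ValueError`, which is what later breaks downstream code that expects to modify or validate the array.

```python
import numpy as np

a = np.zeros(3)
a.flags.writeable = False  # what the solver effectively does to X and y

try:
    a[0] = 1.0  # any write now fails
except ValueError as e:
    print("write rejected:", e)
```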
Describe the solution you'd like
`check_npdarray` has a `writeable` parameter, but it is not exposed at the solver level, so I don't think the user can control it.
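For illustration only (hypothetical names, not modopt's actual API), forwarding such a flag from the solver down to the input check could look like:

```python
import numpy as np

def check_input(data, writeable=True):
    # Hypothetical helper standing in for the input check: validate the
    # array and control whether its writeable flag is left on or cleared.
    data = np.asarray(data)
    data.flags.writeable = writeable
    return data

# A solver constructor exposing `writeable=` could simply pass it through:
X_checked = check_input(np.random.randn(5, 3), writeable=True)
assert X_checked.flags.writeable
```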
Code to reproduce:
```python
import numpy as np
from numpy.linalg import norm

from modopt.opt.algorithms import ForwardBackward
from modopt.opt.proximity import SparseThreshold
from modopt.opt.linear import Identity
from modopt.opt.gradient import GradBasic
from sklearn.linear_model import Lasso

np.random.seed(0)
X = np.random.randn(10, 20)
y = np.random.randn(X.shape[0])
lmbd = norm(X.T @ y) / 20


def op(w):
    return X @ w


fb = ForwardBackward(
    x=np.zeros(X.shape[1]),  # this is the coefficient w
    grad=GradBasic(
        input_data=y, op=op,
        trans_op=lambda res: X.T @ res,
    ),
    prox=SparseThreshold(Identity(), lmbd),
    beta_param=1.0,
    min_beta=1,
    metric_call_period=None,
    xi_restart=0.96,
    restart_strategy='adaptive-1',
    s_greedy=1.01,
    p_lazy=1,
    q_lazy=1,
    auto_iterate=False,
    progress=False,
    cost=None,
)
L = np.linalg.norm(X, ord=2) ** 2
fb.beta_param = 1 / L
fb._beta = 1 / L
fb.iterate(max_iter=100)

# this fails because X and y are now non-writeable:
Lasso(fit_intercept=False, alpha=lmbd / len(y)).fit(X, y)
```

Are you planning to submit a Pull Request?
- Yes
- No
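Until the parameter is exposed, a possible workaround sketch (assuming `X` and `y` own their memory, so the flag can be flipped back after the solver runs; a view whose base is read-only cannot re-enable it):

```python
import numpy as np

X = np.random.randn(10, 20)
y = np.random.randn(10)

# Simulate what the solver does to its inputs:
X.flags.writeable = False
y.flags.writeable = False

# Restore write access afterwards; valid because these arrays own
# their data (X.base is None and y.base is None).
X.flags.writeable = True
y.flags.writeable = True
```

Alternatively, passing copies (`X.copy()`, `y.copy()`) into the modopt solver leaves the original arrays untouched.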