May help understand why you think your method coefficients are well reconstructed #3
Figure 4 is clearly better at identifying the support, although it gets the sign wrong for some of the coefficients. In any case, this is a synthetic example attempting to reproduce a figure from the cited paper: please refer to the paper itself (http://www.jmlr.org/proceedings/papers/v51/figueiredo16.pdf) for more information.
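For intuition on why OWL tends to identify the support of correlated groups better than the lasso, here is a minimal sketch of the OWL norm (not from pyowl; the weight sequence is purely illustrative). Because OWL charges the largest-magnitude coefficients the most, spreading weight across a correlated group is cheaper than concentrating it on one representative:

```python
# Sketch of the OWL norm Omega_w(x) = sum_i w_i * |x|_(i), where
# |x|_(1) >= ... >= |x|_(p) are the sorted absolute values and w is
# non-increasing. The weights below are illustrative, not pyowl defaults.
import numpy as np

def owl_norm(x, w):
    """Ordered weighted L1 norm: dot product of sorted |x| with weights w."""
    return np.dot(w, np.sort(np.abs(x))[::-1])

w = np.linspace(2.0, 1.0, 4)               # decreasing OSCAR-style weights
concentrated = np.array([2.0, 0.0, 0.0, 0.0])
spread = np.array([0.5, 0.5, 0.5, 0.5])    # same L1 norm, weight shared

# Under plain L1 both vectors cost the same (2.0); under OWL the shared
# solution is strictly cheaper, which is why OWL tends to keep whole
# correlated groups rather than picking a single representative column.
print(owl_norm(concentrated, w), owl_norm(spread, w))
```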
I'm not aware of any others.
The difference may come from experimental conditions rather than implementation issues, as you are suggesting :)

On Wed, Aug 7, 2019, 20:48 Sandy4321 ***@***.***> wrote:
> Yes, they do have perfect coefficient reconstruction, but in your case both the amplitudes and the signs are a problem. Do you know of any other OWL implementations to try?
> [image] <https://user-images.githubusercontent.com/11426119/62653043-bbadb800-b92a-11e9-86d5-b05e65b64ffe.png>
So what can be done to make your code run correctly?
What is incorrect? Can you submit a failing test?
I believe the code runs correctly.
But the original true coefficients around sample 25 have negative sign and amplitude 0.2 (figure 1), while your code gives both positive and negative signs with amplitude 0.050 (figure 4)?
Hi,
All of these conversations are getting a bit exhausting.
As I said above, I attribute this recovery error to the experimental setup and hyperparameters, not to any algorithmic bugs. This is supported by the fact that the code passes the included unit tests.
If you can construct an actual example where the provided implementation is wrong (i.e., where it incorrectly computes the proximal operator), that is indeed an issue that should be investigated, and I encourage you to submit such issues (for example, including a failing test case where the returned solution has a suboptimal objective value, or similar).
The example is an end-to-end demo of noisy recovery, which I would not expect to yield exact recovery (especially given the poor result of the lasso). If I should expect otherwise, I must be missing something, and I welcome a (rigorous) explanation.
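To make the "failing test case" suggestion concrete, here is a hedged sketch of such a check; `prox_owl` is a hypothetical stand-in for the implementation under test (pyowl's actual API may differ). The idea is that the prox output should achieve a lower OWL-penalized objective than any nearby perturbation:

```python
# Sketch of an objective-value check for an OWL proximal operator.
# `prox_owl` is a hypothetical callable, not pyowl's actual API.
import numpy as np

def owl_objective(x, v, w):
    """0.5 * ||x - v||^2 + sum_i w_i * |x|_(i) (|x| sorted descending)."""
    return 0.5 * np.sum((x - v) ** 2) + np.dot(w, np.sort(np.abs(x))[::-1])

def prox_looks_optimal(prox_owl, v, w, n_trials=200, seed=0):
    """Return False if a random perturbation beats the prox output, which
    would be a genuine failing test case for the implementation."""
    rng = np.random.default_rng(seed)
    x_hat = prox_owl(v, w)
    f_hat = owl_objective(x_hat, v, w)
    for _ in range(n_trials):
        x_alt = x_hat + 0.1 * rng.standard_normal(v.shape)
        if owl_objective(x_alt, v, w) < f_hat - 1e-10:
            return False
    return True
```

With constant weights, OWL reduces to plain L1, so soft-thresholding is a known-correct prox that can be used to sanity-check the checker itself.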
I see, thanks.
They may be generating the array A differently, using a different noise level, or a different regularization strength. I am not claiming they are cheating. If you want to look into this further, you are welcome to play with the code and adjust things in the example.
I'm not discounting the possibility of a bug, but I am not working on this right now. If your investigations reveal bugs, please file an issue.
On Wed, Aug 7, 2019, 21:56 Sandy4321 ***@***.***> wrote:
> I see, thanks. I assumed that, as in the paper ("Figure 1: Toy example illustrating the qualitatively different behaviour of OWL and LASSO"), perfect reconstruction is possible, as written in the paper.
> Of course, you are aware of this difference, so it would be very interesting to hear how you explain it. Are they cheating by claiming that they can achieve perfect reconstruction? As they wrote: "while OWL successfully recovers its structure."
> "We conclude this section with a simple toy example (Fig. 1) illustrating the qualitatively different behaviour of OWL and LASSO regularization. In this example, p = 100, n = 10, and x* has 20 non-zero components in 2 groups of size 10, with the corresponding columns of A being highly correlated. Clearly, n = 10 is insufficient to allow LASSO to recover x*, which is 20-sparse, while OWL successfully recovers its structure."
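The setup described in the quoted paper passage can be sketched as follows; the correlation scheme and the noise scales are assumptions, since the paper does not fully specify them:

```python
# Minimal reconstruction of the paper's toy setup: p = 100, n = 10, and a
# 20-sparse x* in 2 groups of 10, with highly correlated columns of A.
# The 0.01 noise scales below are assumptions, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)
p, n = 100, 10

x_star = np.zeros(p)
x_star[10:20] = 1.0        # group 1
x_star[40:50] = -1.0       # group 2

A = rng.standard_normal((n, p))
for start in (10, 40):
    base = rng.standard_normal(n)          # shared direction for the group
    A[:, start:start + 10] = base[:, None] + 0.01 * rng.standard_normal((n, 10))

y = A @ x_star + 0.01 * rng.standard_normal(n)   # noisy observations
```

Note that with n = 10 observations and 20 active coefficients, support recovery is only plausible because the active columns are nearly identical, which is precisely the regime where OWL's grouping behaviour helps and the lasso struggles.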
As you wrote, it is exactly what I asked above. Then the answer is "try tuning the hyperparameters", am I right?
Well, it seems your question was "how can I make pyowl return the correct coefficients on the toy problem in the example", not "how can I make the code run correctly" (the latter implies that something is incorrect).
In that case, yes: consider changing the hyperparameters. And keep in mind that what works on a simulated example might not work the same way on real data.
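A minimal sketch of such a hyperparameter sweep, assuming a hypothetical solver `fit_owl(A, y, weights)` (pyowl's real API may differ): scan the scale of the weight sequence and score each fit by support recovery, which is only possible on a synthetic problem where the true support is known:

```python
# Hedged sketch of tuning the OWL weight scale on a synthetic problem.
# `fit_owl` is a hypothetical callable, not pyowl's actual API.
import numpy as np

def support_f1(x_hat, x_true, tol=1e-6):
    """F1 score between the estimated and true supports."""
    est = np.abs(x_hat) > tol
    true = np.abs(x_true) > tol
    tp = np.sum(est & true)
    if tp == 0:
        return 0.0
    precision = tp / est.sum()
    recall = tp / true.sum()
    return 2 * precision * recall / (precision + recall)

def sweep_weight_scale(fit_owl, A, y, x_true, scales=(0.01, 0.1, 1.0, 10.0)):
    """Return (best_scale, best_f1) over linearly decreasing weight vectors."""
    p = A.shape[1]
    results = []
    for s in scales:
        w = s * np.linspace(2.0, 1.0, p)   # OSCAR-style decreasing weights
        results.append((s, support_f1(fit_owl(A, y, w), x_true)))
    return max(results, key=lambda t: t[1])
```

Scoring by support F1 rather than coefficient error matches the discussion above: on this toy problem the interesting question is whether the support and grouping are recovered, not whether the amplitudes match exactly.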
> And keep in mind that what works on a simulated example might not work the same way on real data.
It may help to understand why you think your method's coefficients from figure 4 are reconstructed well, when they are still very different from the originals in figure 1.