
About classifier normalization? #10

Open
HX-idiot opened this issue Oct 2, 2020 · 6 comments

HX-idiot commented Oct 2, 2020

Hello, I read your paper and benefited a lot from it.
I have a question: the paper mentions normalizing the logits, borrowing the idea of the propensity score. What is the rationale behind this? And if the normalization is dropped, how much does that affect the de-confounding?

KaihuaTang (Owner) commented

You can refer to the paper “An Introduction to Propensity Score Methods for Reducing the Effects of Confounding in Observational Studies”. Without the normalization, the de-confounding is incomplete, which breaks the subsequent TDE: the confounding path still exists, so you cannot isolate the direct effect.
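
To make the “direct effect” concrete, below is a minimal single-head sketch of TDE-style inference. It is an illustration under assumed names, not the repository's implementation: `tau` (logit scale), `alpha` (TDE trade-off), `gamma` (stabilizer), and `d_hat` (the unit direction of a training-time moving average of features) are all hypothetical placeholders.

```python
import torch
import torch.nn.functional as F

def tde_logits(x, w, d_hat, tau=16.0, alpha=1.5, gamma=1e-9):
    """Sketch of TDE-style inference: the counterfactual logit along the
    moving-average feature direction d_hat (the confounding path) is
    subtracted from the factual normalized logit, approximating the
    direct effect.

    x:     (B, D) test features
    w:     (C, D) classifier weights
    d_hat: (D,)   unit direction of the training-time feature moving average
    """
    w_norm = w.norm(dim=1)                                # (C,)
    x_norm = x.norm(dim=1, keepdim=True)                  # (B, 1)
    # Factual logit: both features and weights are L2-normalized.
    factual = tau * (x @ w.t()) / ((x_norm + gamma) * w_norm)
    # Counterfactual logit: only the component of x along d_hat is scored.
    cos_xd = F.cosine_similarity(x, d_hat.expand_as(x), dim=1).unsqueeze(1)
    counterfactual = tau * cos_xd * (d_hat @ w.t()).unsqueeze(0) / w_norm
    return factual - alpha * counterfactual
```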

HX-idiot (Author) commented Oct 3, 2020

I followed the reference. So you treat the currently observed samples as if they were sampled from the post-do(x) distribution, and therefore assign each of them a $weight = 1/P(x|M=m)$, right? But why can $P(x|M=m)$ be approximated by the L2-norm? What is that approximation based on?

KaihuaTang (Owner) commented

The design of this normalization is still largely empirical on our side. The propensity-score idea only tells us that the effect needs to be balanced; how exactly to design that balancing is quite an open question.
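
For concreteness, here is a minimal sketch of a classifier normalized in this spirit; the class name and the hyperparameters `tau` and `gamma` are assumptions for illustration, not the repository's exact code. The point is only where the L2-norms enter the logit.

```python
import torch
import torch.nn as nn

class NormalizedClassifier(nn.Module):
    """Minimal single-head sketch: logits are cosine similarities scaled
    by tau, so the raw magnitudes ||x|| and ||w|| no longer inflate the
    scores of frequent classes -- the empirical balancing discussed above."""

    def __init__(self, feat_dim, num_classes, tau=16.0, gamma=1e-9):
        super().__init__()
        self.weight = nn.Parameter(0.01 * torch.randn(num_classes, feat_dim))
        self.tau, self.gamma = tau, gamma

    def forward(self, x):
        w_norm = self.weight.norm(dim=1)                  # (C,)
        x_norm = x.norm(dim=1, keepdim=True)              # (B, 1)
        return self.tau * (x @ self.weight.t()) / ((x_norm + self.gamma) * w_norm)
```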

sky186 commented Nov 17, 2020

@KaihuaTang
Hello, have you tried using this CausalNormClassifier in imbalanced training together with, e.g., circle loss? Your paper shows feature visualizations where the learned features are more compact, so I am wondering whether a CausalNormClassifier branch could be added as an auxiliary training head.

KaihuaTang (Owner) commented

I have not tried that, but it sounds quite plausible. You can try it and see whether it works.

sky186 commented Nov 17, 2020

@KaihuaTang
OK, I will try it and report back.
I am simply adding a CausalNormClassifier head trained with cross-entropy loss. Are there any hyperparameters I should pay attention to when using it?
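
As a rough sketch of that setup (assuming the `NormalizedClassifier` from the earlier sketch and a stand-in linear backbone, neither of which is the repository's code):

```python
import torch
import torch.nn.functional as F

# Stand-in feature extractor; in practice this would be the CNN backbone.
feat_dim, num_classes, batch = 512, 100, 8
backbone = torch.nn.Linear(1024, feat_dim)
head = NormalizedClassifier(feat_dim, num_classes)        # sketch from above
opt = torch.optim.SGD(
    list(backbone.parameters()) + list(head.parameters()), lr=0.1, momentum=0.9
)

inputs = torch.randn(batch, 1024)
labels = torch.randint(0, num_classes, (batch,))

logits = head(backbone(inputs))
loss = F.cross_entropy(logits, labels)  # plain CE on the normalized logits
opt.zero_grad()
loss.backward()
opt.step()

# Hyperparameters worth watching (names as assumed in the sketches above):
# - tau: logit scale; too small gives underconfident logits, too large
#   destabilizes training.
# - gamma: small stabilizer added to the norm.
# - alpha: the TDE trade-off, used only at inference, not during training.
```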
