Question about the task #7

Open
Wicknight opened this issue Oct 18, 2023 · 1 comment

Comments

@Wicknight

Hello! I am new to dataset distillation, so my question may be shallow. It seems to me that dataset distillation is generally aimed at classification tasks, synthesizing condensed data for each class to enable efficient training, just like the node classification task in your work. I'm wondering whether your work can also be applied to the link prediction task on graphs?

@rockcor

rockcor commented Nov 23, 2023

Common node classification methods follow the AXW scheme, which facilitates graph structure learning (fix W and update A/X). Following the same idea for the pairwise link prediction task (MLP(v1, v2)), you can fix the W in the MLP and update v1 and v2. IMO it's pretty trivial to handle the conflict in v1 when simultaneously updating (v1, v2) and (v1, v3).
For subgraph-based link prediction, you can refer to DosCond. For pairwise link prediction, you can simply select fewer positive samples using a suitable strategy.
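A minimal sketch of that idea in PyTorch (all names here are illustrative, not from this repo): freeze the link-prediction MLP's weights W and treat the synthetic node embeddings as the learnable parameters, so gradients from pairs that share an endpoint simply accumulate on the shared row.

```python
import torch
import torch.nn as nn

# Hypothetical setup: a frozen link-prediction MLP scoring a pair of
# node embeddings, and learnable synthetic embeddings to be distilled.
dim = 64
mlp = nn.Sequential(nn.Linear(2 * dim, dim), nn.ReLU(), nn.Linear(dim, 1))
for p in mlp.parameters():
    p.requires_grad_(False)  # fix W, as suggested above

# Synthetic embeddings for three nodes; row 0 (v1) appears in two pairs.
v = nn.Parameter(torch.randn(3, dim))  # rows: v1, v2, v3
opt = torch.optim.Adam([v], lr=1e-2)

# Two positive pairs sharing v1: (v1, v2) and (v1, v3).
pairs = torch.tensor([[0, 1], [0, 2]])
labels = torch.ones(len(pairs), 1)

for step in range(100):
    opt.zero_grad()
    logits = mlp(torch.cat([v[pairs[:, 0]], v[pairs[:, 1]]], dim=-1))
    loss = nn.functional.binary_cross_entropy_with_logits(logits, labels)
    # Gradients from both pairs accumulate on the shared row v1, which is
    # why updating (v1, v2) and (v1, v3) simultaneously is unproblematic.
    loss.backward()
    opt.step()
```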
