
S4NMF

Self-Supervised Semi-Supervised Nonnegative Matrix Factorization for Data Clustering

Just run 'S4NMF.ipynb'.

Jovan Chavoshinejad, Seyed Amjad Seyedi, Fardin Akhlaghian Tab, Navid Salahian

Pattern Recognition 2023

https://doi.org/10.1016/j.patcog.2022.109282

Abstract

Semi-supervised nonnegative matrix factorization exploits the strengths of matrix factorization in successfully learning part-based representations and is also able to achieve high learning performance when facing a scarcity of labeled data and a large amount of unlabeled data. Its major challenge lies in how to learn more discriminative representations from limited labeled data. Furthermore, self-supervised learning has proven very effective at learning representations from unlabeled data in various learning tasks. Recent research works focus on utilizing the capacity of self-supervised learning to enhance semi-supervised learning. In this paper, we design an effective Self-Supervised Semi-Supervised Nonnegative Matrix Factorization (S4NMF) in a semi-supervised clustering setting. The S4NMF directly extracts a consensus result from ensembled NMFs with similarity and dissimilarity regularizations. In an iterative process, this self-supervisory information is fed back to the proposed model to boost semi-supervised learning and form more distinct clusters. The task is defined as an optimization problem with a well-formulated objective function and is solved by a proposed iterative algorithm. In addition, theoretical and empirical analyses investigate the convergence of the proposed optimization algorithm. To demonstrate the effectiveness of the proposed model in semi-supervised clustering, we conduct extensive experiments on standard benchmark datasets.
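The feedback loop described in the abstract can be sketched in a few lines of NumPy. The snippet below is only an illustrative simplification, not the S4NMF update rules from the paper: the helper names (`nmf`, `consensus_similarity`, `similarity_regularized_nmf`), the blending weight, and the regularization parameter `lam` are assumptions, and the similarity/dissimilarity term is handled with standard graph-regularized multiplicative updates as a stand-in for the paper's derived updates. An ensemble of plain NMF runs votes into a co-association matrix, the few known labels overwrite it as similarity/dissimilarity constraints, and each new clustering is fed back as self-supervision for the next round.

```python
import numpy as np

def nmf(X, k, iters=200, seed=0):
    """Plain NMF with Lee-Seung multiplicative updates (Frobenius loss); X is d x n, nonnegative."""
    rng = np.random.default_rng(seed)
    d, n = X.shape
    W = rng.random((d, k)) + 1e-3
    H = rng.random((k, n)) + 1e-3
    for _ in range(iters):
        H *= (W.T @ X) / (W.T @ W @ H + 1e-10)
        W *= (X @ H.T) / (W @ H @ H.T + 1e-10)
    return W, H

def consensus_similarity(X, k, labels, runs=5, seed=0):
    """Co-association similarity from an ensemble of NMF clusterings, overridden by the
    similarity/dissimilarity implied by the few labeled samples (labels[i] = -1 if unknown)."""
    n = X.shape[1]
    S = np.zeros((n, n))
    for r in range(runs):
        _, H = nmf(X, k, seed=seed + r)
        assign = H.argmax(axis=0)                   # hard clustering from one ensemble member
        S += (assign[:, None] == assign[None, :])   # co-association votes
    S /= runs
    known = labels >= 0
    both = known[:, None] & known[None, :]
    S[both & (labels[:, None] == labels[None, :])] = 1.0  # must-link (similarity)
    S[both & (labels[:, None] != labels[None, :])] = 0.0  # cannot-link (dissimilarity)
    return S

def similarity_regularized_nmf(X, S, k, lam=0.5, iters=200, seed=0):
    """Graph-regularized NMF: min ||X - WH||_F^2 + lam * tr(H (D - S) H^T), with W, H >= 0."""
    rng = np.random.default_rng(seed)
    d, n = X.shape
    W = rng.random((d, k)) + 1e-3
    H = rng.random((k, n)) + 1e-3
    D = np.diag(S.sum(axis=1))
    for _ in range(iters):
        H *= (W.T @ X + lam * H @ S) / (W.T @ W @ H + lam * H @ D + 1e-10)
        W *= (X @ H.T) / (W @ H @ H.T + 1e-10)
    return W, H

# Toy data: 100 nonnegative samples, 3 clusters, labels revealed for only 10 samples.
rng = np.random.default_rng(0)
X = np.abs(rng.normal(size=(50, 100)))
labels = np.full(100, -1)
labels[rng.choice(100, size=10, replace=False)] = rng.integers(0, 3, size=10)

S = consensus_similarity(X, k=3, labels=labels)     # initial consensus from the NMF ensemble
for _ in range(3):                                  # self-supervised feedback rounds
    W, H = similarity_regularized_nmf(X, S, k=3)
    assign = H.argmax(axis=0)
    # Feed the new clustering back as self-supervision by blending it into the similarity.
    S = 0.5 * S + 0.5 * (assign[:, None] == assign[None, :])
clusters = H.argmax(axis=0)
```

For the actual optimization, convergence analysis, and experimental settings, see the paper and the 'S4NMF.ipynb' notebook.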

Cite

@article{CHAVOSHINEJAD2022109282,
  title    = {Self-Supervised Semi-Supervised Nonnegative Matrix Factorization for Data Clustering},
  journal  = {Pattern Recognition},
  pages    = {109282},
  year     = {2023},
  issn     = {0031-3203},
  doi      = {10.1016/j.patcog.2022.109282},
  url      = {https://www.sciencedirect.com/science/article/pii/S0031320322007610},
  author   = {Jovan Chavoshinejad and Seyed Amjad Seyedi and Fardin Akhlaghian Tab and Navid Salahian},
  keywords = {Nonnegative matrix factorization, Semi-supervised learning, Self-supervised learning, Ensemble clustering}
}
