TY - GEN
T1 - LogPar: Logistic PARAFAC2 Factorization for Temporal Binary Data with Missing Values
T2 - 26th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining, KDD 2020
AU - Yin, Kejing
AU - Afshar, Ardavan
AU - Ho, Joyce C.
AU - Cheung, Kwok Wai
AU - Zhang, Chao
AU - Sun, Jimeng
N1 - Funding Information:
This research is supported in part by General Research Fund RGC/HKBU12201219 and RGC/HKBU12202117 from the Research Grants Council of Hong Kong, the National Science Foundation awards IIS-1418511, CCF-1533768, IIS-1838042 and IIS-1838200, and the National Institutes of Health awards 1R01MD011682-01, R56HL138415 and 1K01LM012924-01.
PY - 2020/8/23
Y1 - 2020/8/23
AB - Binary data with one-class missing values are ubiquitous in real-world applications. They can be represented by irregular tensors with varying sizes in one dimension, where a value of one indicates the presence of a feature while a zero means unknown (i.e., the feature may be either present or absent). Learning accurate low-rank approximations from such binary irregular tensors is challenging: none of the existing models developed for factorizing irregular tensors takes the missing values into account, and they assume Gaussian distributions, resulting in a distribution mismatch when applied to binary data. In this paper, we propose Logistic PARAFAC2 (LogPar), which models the binary irregular tensor with a Bernoulli distribution parameterized by an underlying real-valued tensor. We then approximate the underlying tensor with a positive-unlabeled learning loss function to account for the missing values, and incorporate uniqueness and temporal smoothness regularization to enhance interpretability. Extensive experiments using large-scale real-world datasets show that LogPar outperforms all baselines in both irregular tensor completion and downstream predictive tasks. For irregular tensor completion, LogPar achieves up to a 26% relative improvement over the best baseline. In addition, LogPar obtains average relative improvements of 13.2% for heart failure prediction and 14% for mortality prediction compared to state-of-the-art PARAFAC2 models.
KW - binary tensor completion
KW - computational phenotyping
KW - PARAFAC2 factorization
KW - tensor factorization
UR - http://www.scopus.com/inward/record.url?scp=85090406475&partnerID=8YFLogxK
U2 - 10.1145/3394486.3403213
DO - 10.1145/3394486.3403213
M3 - Conference contribution
AN - SCOPUS:85090406475
T3 - Proceedings of the ACM SIGKDD International Conference on Knowledge Discovery and Data Mining
SP - 1625
EP - 1635
BT - KDD 2020 - Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining
PB - Association for Computing Machinery (ACM)
Y2 - 23 August 2020 through 27 August 2020
ER -