TY - GEN
T1 - Multi-Modal Media Retrieval via Distance Metric Learning for Potential Customer Discovery
AU - LIU, Yang
AU - Gu, Zhonglei
AU - Ko, Tobey H.
AU - LIU, Jiming
N1 - Funding Information:
This work was supported in part by the National Natural Science Foundation of China (NSFC) under Grant 61503317, in part by the General Research Fund (GRF) from the Research Grant Council (RGC) of Hong Kong SAR under Project HKBU12202417, and in part by the SZSTI Grant with the Project Code JCYJ20170307161544087.
PY - 2019/1/10
Y1 - 2019/1/10
N2 - As social media has grown to become an integral part of many people's daily lives, brands are quick to launch targeted social media marketing campaigns to acquire new potential customers online. To facilitate the potential customer discovery process, a costly and labor-intensive manual selection process is performed to build a brand portfolio consisting of multimedia data relevant to the brand. To automate this process in a cost-effective way, in this paper we propose a novel Multi-Modal Distance Metric Learning (M2DML) method, which learns a data-dependent similarity metric from multi-modal media data, aiming to assist brands in retrieving appropriate media data from social networks for potential customer discovery. To comprehensively model the supervised information of multi-modal data, M2DML learns both the intra-modality and inter-modality distance metrics simultaneously. To further exploit the unsupervised information of the dataset, M2DML preserves the manifold structure of the multi-modal data. The proposed method is formulated as a standard eigen-decomposition problem, and the closed-form solution is computed efficiently. Experiments on a standard multi-modal media dataset and a self-collected dataset validate the effectiveness of the proposed method.
AB - As social media has grown to become an integral part of many people's daily lives, brands are quick to launch targeted social media marketing campaigns to acquire new potential customers online. To facilitate the potential customer discovery process, a costly and labor-intensive manual selection process is performed to build a brand portfolio consisting of multimedia data relevant to the brand. To automate this process in a cost-effective way, in this paper we propose a novel Multi-Modal Distance Metric Learning (M2DML) method, which learns a data-dependent similarity metric from multi-modal media data, aiming to assist brands in retrieving appropriate media data from social networks for potential customer discovery. To comprehensively model the supervised information of multi-modal data, M2DML learns both the intra-modality and inter-modality distance metrics simultaneously. To further exploit the unsupervised information of the dataset, M2DML preserves the manifold structure of the multi-modal data. The proposed method is formulated as a standard eigen-decomposition problem, and the closed-form solution is computed efficiently. Experiments on a standard multi-modal media dataset and a self-collected dataset validate the effectiveness of the proposed method.
KW - Multi-modal distance metric learning
KW - Multi-modal media retrieval
KW - Potential customer discovery
KW - Social media mining
UR - http://www.scopus.com/inward/record.url?scp=85061932278&partnerID=8YFLogxK
U2 - 10.1109/WI.2018.00-75
DO - 10.1109/WI.2018.00-75
M3 - Conference proceeding
AN - SCOPUS:85061932278
T3 - Proceedings - 2018 IEEE/WIC/ACM International Conference on Web Intelligence, WI 2018
SP - 310
EP - 317
BT - Proceedings - 2018 IEEE/WIC/ACM International Conference on Web Intelligence, WI 2018
PB - IEEE
T2 - 18th IEEE/WIC/ACM International Conference on Web Intelligence, WI 2018
Y2 - 3 December 2018 through 6 December 2018
ER -