Deep structured energy based models for anomaly detection
Shuangfei Zhai, Yu Cheng, et al.
ICML 2016
Multi-task learning aims to improve the generalization performance of multiple prediction tasks by appropriately sharing relevant information across them. In the context of deep neural networks, this idea is often realized through hand-designed architectures with layers that are shared across tasks and branches that encode task-specific features. However, the space of possible multi-task deep architectures is combinatorially large, and the final architecture is often arrived at by manual exploration of this space, which can be both error-prone and tedious. We propose an automatic approach for designing compact multi-task deep learning architectures. Our approach starts with a thin multi-layer network and dynamically widens it in a greedy manner during training. By doing so iteratively, it creates a tree-like deep architecture in which similar tasks share the same branch until the top layers. Evaluation on person attribute classification tasks involving facial and clothing attributes suggests that the models produced by the proposed method are fast and compact, and can closely match or exceed the state-of-the-art accuracy of strong baselines that use much more expensive models.
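The greedy widening step described in the abstract can be illustrated with a short sketch. The following is a minimal, assumed illustration rather than the authors' implementation: task affinity is estimated here from the correlation of per-sample task errors, which is a stand-in for the paper's actual grouping criterion, and the names task_affinity, greedy_grouping, and TaskTree are hypothetical.

```python
# Minimal sketch of one greedy "widening" step: group related tasks
# into branches on top of a thin shared trunk. The affinity measure
# (correlation of per-sample task errors) is an assumed stand-in for
# the paper's criterion; all names here are hypothetical.
import torch
import torch.nn as nn

def task_affinity(errors):
    # errors: (num_samples, num_tasks) 0/1 mistakes per task.
    # Tasks that err on the same examples get high affinity.
    e = errors - errors.mean(dim=0, keepdim=True)
    cov = e.t() @ e
    d = torch.sqrt(torch.diag(cov)).clamp(min=1e-8)
    return cov / (d[:, None] * d[None, :])

def greedy_grouping(affinity, num_branches):
    # Agglomeratively merge the two most-related task groups until
    # only `num_branches` groups remain.
    groups = [[t] for t in range(affinity.shape[0])]
    while len(groups) > num_branches:
        best, pair = -float("inf"), (0, 1)
        for i in range(len(groups)):
            for j in range(i + 1, len(groups)):
                score = affinity[groups[i]][:, groups[j]].mean().item()
                if score > best:
                    best, pair = score, (i, j)
        i, j = pair
        groups[i] += groups.pop(j)
    return groups

class TaskTree(nn.Module):
    # A thin shared trunk followed by one branch (and head) per group.
    def __init__(self, in_dim, hidden, groups):
        super().__init__()
        self.groups = groups
        self.trunk = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.branches = nn.ModuleList(
            nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU(),
                          nn.Linear(hidden, len(g)))
            for g in groups)

    def forward(self, x):
        h = self.trunk(x)
        # One output tensor per branch; branch k scores tasks groups[k].
        return [branch(h) for branch in self.branches]

# Example: tasks 0/1 and 2/3 make correlated mistakes, so each pair
# ends up sharing a branch.
errors = torch.cat([torch.rand(256, 1).round().repeat(1, 2),
                    torch.rand(256, 1).round().repeat(1, 2)], dim=1)
groups = greedy_grouping(task_affinity(errors), num_branches=2)
model = TaskTree(in_dim=64, hidden=32, groups=groups)
```

In the full procedure this split would be applied layer by layer during training, so the network branches out progressively from a single trunk into the tree-like architecture the abstract describes.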
Eli Schwartz, Leonid Karlinsky, et al.
NeurIPS 2018
Abhishek Kumar, Kahini Wadhawan, et al.
NeurIPS 2018
Zhengping Che, Yu Cheng, et al.
ICDM 2017