Welcome to DELL@ICDM'18

A child learns about the world by absorbing external knowledge and ideas throughout the stages of development. Following a parallel path, a new learning paradigm has emerged in machine learning - developmental learning - which aims to grow the intelligence of machines in a developmental context.

The world is dynamic and evolving. With the rapid development of information technology, every moment witnesses a surge of data produced in almost every industry and sector. Classical data analysis methods, which operate in a closed loop and rarely interact with the outside environment, thus miss the opportunity for new data to enrich the system, deliver novel concepts, and bring in new interpretations. Rather than remaining stagnant, a machine should follow the evolution of knowledge and keep up with the state of the art. The idea of sustainable development is reflected in different scenarios, e.g., online learning [1], lifelong learning [2], and learning to learn [3]. Their results suggest that beyond training and testing models, it is more important and challenging to maintain models in a developmental context.

Recently, several new methods have been proposed for learning: from past labels to new labels [4, 5], from past features to new features [6-8], from past tasks to new tasks [9-11], and from easy examples to complex examples [12-16]. In many cases, however, these approaches lack a deep understanding of the connections among them and of a possible common underlying philosophy of developmental learning. Moreover, in the presence of demands for development, the theories, algorithms, and prototypes of the traditional stationary learning paradigm are no longer effective or efficient. The main purpose of this first "International Workshop on Developmental Learning" is to bring together scientists, researchers, and practitioners from different disciplines (including computer science, psychology, and social science) to present recent advances in developmental learning, address fundamental challenges of developmental learning in data mining, identify prospective applications of developmental learning, and foster interactions between disciplines to promote research on developmental learning.

The topics of interest include, but are not limited to:

  • Addressing similarities and differences between stationary and developmental learning.
  • Designing effective optimization techniques for the constantly emerging big data.
  • Re-visiting traditional stationary learning problems to confront the dynamic environment.
  • Identifying new problems and applications of developmental learning in the real world.
  • Providing in-depth analysis on the theoretical foundations of developmental learning.
  • Developing novel solutions to tackle challenges arising in developmental settings.

[1] Anderson, T. (Ed.). (2008). The theory and practice of online learning. Athabasca University Press.
[2] Silver, D. L., Yang, Q., & Li, L. (2013, March). Lifelong Machine Learning Systems: Beyond Learning Algorithms. In AAAI Spring Symposium: Lifelong Machine Learning (Vol. 13, p. 05).
[3] Thrun, S., & Pratt, L. (Eds.). (2012). Learning to learn. Springer Science & Business Media.
[4] You, S., Xu, C., Wang, Y., Xu, C., & Tao, D. (2016). Streaming Label Learning for Modeling Labels on the Fly. arXiv preprint arXiv:1604.05449.
[5] Mu, X., Zhu, F., Du, J., Lim, E. P., & Zhou, Z. H. (2017). Streaming Classification with Emerging New Class by Class Matrix Sketching. In AAAI (pp. 2373-2379).
[6] Wu, X., Yu, K., Ding, W., Wang, H., & Zhu, X. (2013). Online feature selection with streaming features. IEEE transactions on pattern analysis and machine intelligence, 35(5), 1178-1192.
[7] Xu, C., Tao, D., & Xu, C. (2016). Streaming View Learning. arXiv preprint arXiv:1604.08291.
[8] Hou, B. J., Zhang, L., & Zhou, Z. H. (2017). Learning with Feature Evolvable Streams. In Advances in Neural Information Processing Systems (pp. 1416-1426).
[9] Ruvolo, P., & Eaton, E. (2013, February). ELLA: An efficient lifelong learning algorithm. In International Conference on Machine Learning (pp. 507-515).
[10] Ruvolo, P., & Eaton, E. (2014, July). Online Multi-Task Learning via Sparse Dictionary Optimization. In AAAI (pp. 2062-2068).
[11] Pentina, A., & Lampert, C. H. (2015). Lifelong learning with non-iid tasks. In Advances in Neural Information Processing Systems (pp. 1540-1548).
[12] Kumar, M. P., Packer, B., & Koller, D. (2010). Self-paced learning for latent variable models. In Advances in Neural Information Processing Systems (pp. 1189-1197).
[13] Jiang, L., Meng, D., Yu, S. I., Lan, Z., Shan, S., & Hauptmann, A. (2014). Self-paced learning with diversity. In Advances in Neural Information Processing Systems (pp. 2078-2086).
[14] Pentina, A., Sharmanska, V., & Lampert, C. H. (2015). Curriculum learning of multiple tasks. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (pp. 5492-5500).
[15] Bengio, Y., Louradour, J., Collobert, R., & Weston, J. (2009, June). Curriculum learning. In Proceedings of the 26th annual international conference on machine learning (pp. 41-48). ACM.
[16] Gong, C., Tao, D., Liu, W., Liu, L., & Yang, J. (2017). Label propagation via teaching-to-learn and learning-to-teach. IEEE transactions on neural networks and learning systems, 28(6), 1452-1465.