
Independent Component Analysis (ICA) is a computational technique for revealing hidden factors that underlie sets of measurements or signals. ICA assumes a statistical model in which the observed multivariate data, typically given as a large database of samples, are linear or nonlinear mixtures of unknown latent variables; the mixing coefficients are also unknown. The latent variables are nongaussian and mutually independent, and they are called the independent components of the observed data. ICA recovers these independent components, also called sources or factors. ICA can thus be seen as an extension of Principal Component Analysis and Factor Analysis; it is a much richer technique, however, capable of finding the sources when these classical methods fail completely.
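As an illustration of the linear model described above (not part of the course material), the sketch below mixes two synthetic nongaussian sources with an unknown matrix and recovers them with a FastICA-style fixed-point iteration after whitening. The toy signals, variable names, and iteration count are all illustrative choices, not anything prescribed by the course:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000
t = np.linspace(0, 8, n)

# Two independent, nongaussian latent sources (the "independent components").
s1 = np.sign(np.sin(3 * t))        # square wave
s2 = ((2 * t) % 2) - 1             # sawtooth wave
S = np.vstack([s1, s2])

# Unknown mixing matrix A; we only observe the mixtures X = A S.
A = np.array([[1.0, 0.5],
              [0.7, 1.2]])
X = A @ S

# Whitening: center the data and transform it to unit covariance.
Xc = X - X.mean(axis=1, keepdims=True)
cov = Xc @ Xc.T / n
d, E = np.linalg.eigh(cov)
Z = E @ np.diag(d ** -0.5) @ E.T @ Xc

# FastICA fixed-point iteration with g(u) = tanh(u) and
# symmetric decorrelation of the unmixing matrix W at each step.
W = rng.standard_normal((2, 2))
for _ in range(200):
    U = W @ Z
    G = np.tanh(U)
    Gp = 1.0 - G ** 2
    W_new = (G @ Z.T) / n - np.diag(Gp.mean(axis=1)) @ W
    u, _, vt = np.linalg.svd(W_new)
    W = u @ vt                      # orthogonalize: W (W^T W)^{-1/2}

S_est = W @ Z                       # recovered sources, up to sign/order
```

The recovered rows of `S_est` match the true sources only up to permutation and sign, which is the well-known inherent ambiguity of the ICA model.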

In many cases, the measurements are given as a set of parallel signals or images. Typical examples are mixtures of simultaneous sounds or human voices that have been picked up by several microphones, brain images obtained by MRI, or several radio signals arriving at a portable phone. The term blind source separation is used to characterize this problem.

The tutorial course at SCIA 2005 covers the basic principles and approaches to independent component analysis, concentrating on fast algorithms for separating a number of source signals or images from their instantaneous mixtures. Sound or speech separation will not be included. Several applications will be covered in detail: separation of astrophysical images, finding hidden factors in climate patterns, extraction of meaningful signals from biomedical measurements and images, finding texture features, and finding hidden factors from text documents.

This course will enable you to:
- specify a linear mixture model for an application
- define the statistical properties of the signals
- solve the separation model using an ICA algorithm
- judge the correctness of the model.
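One simple way to approach the last objective, judging the correctness of the model, is to check that the recovered components are clearly nongaussian, since the ICA model requires nongaussian sources. A common measure is excess kurtosis, which is zero for a Gaussian, positive for super-gaussian (peaked) distributions, and negative for sub-gaussian (flat) ones. A minimal sketch; the helper function and sample sizes are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

def excess_kurtosis(x):
    """Sample excess kurtosis: E[z^4] - 3 for standardized z (0 for a Gaussian)."""
    z = (x - x.mean()) / x.std()
    return np.mean(z ** 4) - 3.0

gauss = rng.standard_normal(n)           # excess kurtosis near 0
laplace = rng.laplace(size=n)            # super-gaussian: near +3
uniform = rng.uniform(-1, 1, size=n)     # sub-gaussian: near -1.2
```

A component whose excess kurtosis is close to zero may be (nearly) Gaussian, in which case the ICA model cannot identify it reliably.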

The course is suitable for students, researchers and practitioners of statistical signal and image processing who want to learn about a powerful blind separation method. The level of the course is intermediate; basic understanding of matrix algebra and multivariate statistics is a prerequisite. Copies of the lecture slides will be provided for the participants.


Erkki Oja is Professor of Computer Science at the Laboratory of Computer and Information Science, Helsinki University of Technology, Finland. He received his Dr.Sc. degree in 1977. He has been a research associate at Brown University, Providence, RI, and a visiting professor at the Tokyo Institute of Technology. Dr. Oja is the author or coauthor of more than 270 articles and book chapters on pattern recognition, computer vision, and neural computing, and of three books: "Subspace Methods of Pattern Recognition" (RSP and J. Wiley, 1983), which has been translated into Chinese and Japanese, "Kohonen Maps" (Elsevier, 1999), and "Independent Component Analysis" (J. Wiley, 2001). His research interests are in principal component and independent component analysis, self-organization, statistical pattern recognition, and the application of artificial neural networks to computer vision and signal processing. Dr. Oja is a member of the editorial boards of several journals and has served on the program committees of several recent conferences, including ICANN, IJCNN, and ICPR. He is a member of the Finnish Academy of Sciences, a Founding Fellow of the International Association for Pattern Recognition (IAPR), a Fellow of the IEEE, and President of the European Neural Network Society (ENNS).

Pattern Recognition Society of Finland

The International Association for Pattern Recognition