Welcome to Pengyu Hong's Homepage


Automatic Temporal Pattern Extraction and Association

The applications of this research include video summarization, automatic event detection and recognition, automatic concept learning, and so on. The goal is to detect temporal patterns, each of which recurs frequently, with variations, in a temporal signal sequence. For example, the temporal signal sequences could be the movements of the head, hands, and body, a piece of music, and so on. The patterns of body movement capture the habits of a person; the patterns of a piece of music capture its melodic phrases. These patterns encode the characteristics of the original temporal sequence and can be used for data summarization and pattern detection. Since activities typically cohere across modalities, a stand-alone pattern is seldom meaningful; it is the correlations among patterns across modalities that endow a pattern with meaning. Hence, we also investigated how to automatically extract temporal patterns and automatically associate the patterns extracted from different modalities.

We developed a Threshold Fuzzy State Network to model the temporal signal sequence, and proposed a Scale Pursuing Process to train it. The trained Threshold Fuzzy State Network identifies sub-patterns of the patterns, which are then grown into whole patterns. The proposed approach is applied to the automatic analysis of multiple synchronized temporal sequences from several input modalities. The timing information of the extracted patterns is used to learn the correlations of the patterns across modalities, and the learned correlations lead to automatic concept learning.
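The Threshold Fuzzy State Network itself is not specified on this page; as a rough illustration of the seed-and-grow idea, the sketch below finds short recurring windows in a 1-D sequence and extends a matching pair while the values stay close. This is a generic motif-discovery stand-in, not the actual network; the window length `w` and distance threshold `thresh` are illustrative parameters.

```python
import numpy as np

def find_seed_pairs(x, w=8, thresh=1.0):
    """Find pairs of non-overlapping length-w windows that nearly match (sub-pattern seeds)."""
    n = len(x) - w + 1
    windows = np.stack([x[i:i + w] for i in range(n)])
    pairs = []
    for i in range(n):
        for j in range(i + w, n):  # enforce non-overlap
            d = np.linalg.norm(windows[i] - windows[j])
            if d < thresh:
                pairs.append((i, j, d))
    return pairs

def grow_pair(x, i, j, w, thresh=1.0):
    """Grow a matching seed pair to the right while the two segments stay pointwise close."""
    L = w
    while j + L < len(x) and abs(x[i + L] - x[j + L]) < thresh:
        L += 1
    return L  # length of the grown pattern

# Toy sequence with one motif repeated twice at indices 5 and 24.
motif = np.sin(np.linspace(0, 2 * np.pi, 12))
x = np.concatenate([np.zeros(5), motif, np.zeros(7), motif, np.zeros(5)])
pairs = find_seed_pairs(x, w=8, thresh=0.5)
print(grow_pair(x, 5, 24, 8))
```

The seeds correspond to the detected sub-patterns; growing them recovers the full extent of each recurrence.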


References

  • Hong, P. and Huang, T. S. Automatic Temporal Pattern Extraction and Association. International Conference on Pattern Recognition (ICPR), 2002.
  • Hong, P. and Huang, T. S. Learning to Extract Multi-Temporal Signal Patterns from Temporal Signal Sequence. 15th International Conference on Pattern Recognition (ICPR), Barcelona, Spain, Sep 3-7, 2000.

Examples:

(1) Automatically model and extract temporal patterns from a mouse trajectory. The user uses the mouse to draw two patterns, and the mouse trajectory is uniformly sampled.
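The page does not show how the trajectory was sampled; a minimal sketch of one common approach, resampling a 2-D polyline at equal arc-length steps, is given below. The function name and parameters are hypothetical, not the original capture code.

```python
import numpy as np

def resample_uniform(points, n):
    """Resample a 2-D polyline to n points equally spaced along its arc length."""
    points = np.asarray(points, dtype=float)
    seg = np.linalg.norm(np.diff(points, axis=0), axis=1)
    s = np.concatenate([[0.0], np.cumsum(seg)])  # cumulative arc length
    target = np.linspace(0.0, s[-1], n)          # equally spaced arc-length stops
    x = np.interp(target, s, points[:, 0])
    y = np.interp(target, s, points[:, 1])
    return np.column_stack([x, y])

# Irregularly spaced points on a straight line become evenly spaced.
raw = [(0, 0), (1, 0), (4, 0), (10, 0)]
print(resample_uniform(raw, 6))
```

Uniform sampling removes the speed variation of the hand, so the same drawn shape produces comparable signal segments regardless of how fast it was drawn.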

(1.1) The data

(a) The first pattern.

(b) The second pattern.

(c) A video showing how the data was captured.

(d) The x-trajectory of a mouse trajectory segment.


(1.2) The intermediate and final results of the mouse pattern extraction are shown below.

(1.2.1) Local search results. (A video clip shows an example of the local search.)

(1.2.2) Global search results

After eight iterations, the algorithm extracted the results below.

Data samples corresponding to the extracted pattern.

(2) Automatically extract and associate temporal patterns from two synchronized temporal signal sequences. One of the sequences is the mouse trajectory, captured while drawing the two patterns shown above. The other is the annotation sound track, recorded synchronously with the mouse trajectory.

(a) Channel 1: The x coordinate sequence of the mouse trajectory.

(b) Channel 2: The synchronized audio annotation track.

(c) The extracted mouse trajectory patterns.

(d) The extracted audio patterns.

The links between the patterns extracted from the two modalities are established according to their timing information. The sound segment "pattern1" is associated with the mouse trajectory segments corresponding to "pattern1", and likewise for "pattern2". This leads to automatic concept learning.
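The association step can be pictured as interval matching: link each mouse pattern to the audio pattern whose occurrences overlap it most in time. The sketch below is a hedged illustration of this idea, not the published algorithm; the timings, labels, and the `min_overlap` threshold are made up for the example.

```python
def overlap(a, b):
    """Length of overlap between two (start, end) intervals."""
    return max(0.0, min(a[1], b[1]) - max(a[0], b[0]))

def associate(mouse_occurrences, audio_occurrences, min_overlap=0.5):
    """Link each mouse pattern to the audio pattern that covers the largest
    fraction of its occurrence intervals, if that fraction exceeds min_overlap."""
    links = {}
    for m_label, m_ivs in mouse_occurrences.items():
        best, best_score = None, 0.0
        for a_label, a_ivs in audio_occurrences.items():
            covered = sum(overlap(m, a) for m in m_ivs for a in a_ivs)
            total = sum(e - s for s, e in m_ivs)
            score = covered / total
            if score > best_score:
                best, best_score = a_label, score
        if best_score >= min_overlap:
            links[m_label] = best
    return links

# Illustrative timings (seconds): each drawing overlaps its spoken annotation.
mouse = {"pattern1": [(0.0, 2.0), (6.0, 8.0)], "pattern2": [(3.0, 5.0)]}
audio = {"pattern1": [(0.2, 2.1), (6.1, 8.2)], "pattern2": [(3.1, 5.2)]}
print(associate(mouse, audio))
```

Because the drawing and the spoken annotation co-occur in time, each mouse pattern is linked to the matching audio pattern, which is the sense in which timing alone can ground a cross-modal concept.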