Gromov-Wasserstein is an Optimal Transport metric designed to align heterogeneous distributions, i.e. distributions that do not live in the same metric space. We have shown that it can be used to compare graphs and proposed a sliced version to alleviate its computational burden.
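To give an intuition of the slicing idea, here is a minimal sketch (not the full sliced Gromov-Wasserstein computation): both point clouds are projected onto random 1-D directions, where optimal transport reduces to sorting. The function name `sliced_w2` and the Monte-Carlo formulation are illustrative, and equal-size point clouds are assumed.

```python
import math
import random

def sliced_w2(X, Y, n_proj=50, seed=0):
    """Monte-Carlo sliced 2-Wasserstein distance between two 2-D point
    clouds of equal size (illustration of the slicing principle only).

    Each random direction yields a 1-D transport problem, which is
    solved exactly by matching sorted projections.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_proj):
        angle = rng.uniform(0.0, 2.0 * math.pi)
        c, s = math.cos(angle), math.sin(angle)
        # In 1-D, the optimal coupling simply matches sorted samples.
        x = sorted(c * px + s * py for px, py in X)
        y = sorted(c * qx + s * qy for qx, qy in Y)
        total += sum((a - b) ** 2 for a, b in zip(x, y)) / len(x)
    return math.sqrt(total / n_proj)
```

Each projection costs only a sort, which is what makes sliced variants attractive compared to solving the full coupling problem.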
Machine Learning & Optimal Transport
Machine Learning & Time Series
This section gathers Machine Learning tools dedicated to time series with no specific focus on environmental data.
Related funded project
Related source code
- tslearn: A machine learning toolkit dedicated to time-series data
- Learning DTW-Preserving Shapelets
- Cost-Aware Early Classification of Time Series
- Dense Bag-of-Temporal-SIFT-Words (Adeline Bailly's code)
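Several of the tools listed above build on Dynamic Time Warping (DTW). As a reference point, here is a minimal textbook implementation of the DTW distance; the quadratic dynamic program is illustrative, and tslearn ships an optimized equivalent.

```python
import math

def dtw(s, t):
    """Dynamic Time Warping distance between two 1-D sequences.

    Textbook O(len(s) * len(t)) dynamic program over squared
    point-wise costs, returning the square root of the optimal
    cumulative cost.
    """
    n, m = len(s), len(t)
    # cost[i][j] = best cumulative cost aligning s[:i] with t[:j]
    cost = [[math.inf] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = (s[i - 1] - t[j - 1]) ** 2
            cost[i][j] = d + min(cost[i - 1][j],      # repeat t[j-1]
                                 cost[i][j - 1],      # repeat s[i-1]
                                 cost[i - 1][j - 1])  # match
    return math.sqrt(cost[n][m])
```

Because DTW allows time-axis stretching, `dtw([0, 0, 1, 2], [0, 1, 2])` is zero even though the sequences have different lengths.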
Related papers
Machine Learning for Earth Observation
A lot of Earth observation data are timestamped. Designing ML techniques that can handle this time dimension can often lead to much improved performance. We have so far turned our focus to three different types of environmental data: chemistry data in streams, remote sensing data (such as satellite image time series) and ship trajectory data.
Related funded project
Related datasets
- Ushant AIS dataset: a dataset of ship trajectories in the Ushant traffic separation scheme (in Brittany, West of France)
- GEE-TSDA: a remote sensing dataset to evaluate domain adaptation on time series
Related papers
Time-Sensitive Graphical Models
We have been using time-sensitive topic models (such as Probabilistic Latent Semantic Motifs or Hierarchical Dirichlet Latent Semantic Motifs) to perform action recognition in videos. More recently, we have turned our focus towards the design of richer models that better capture continuous processes in continuous time.
Related papers
Indexing & IR [Past]
Our main goal in this project was to introduce new indexing schemes able to efficiently deal with time series. One contribution in this field was iSAX+, an approximate-lower-bound-based indexing scheme for DTW. Some work on vector data indexing is also cited here.
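The pruning principle behind lower-bound-based DTW indexing can be sketched with the classical LB_Keogh bound; this is a generic illustration, not the exact scheme used in iSAX+.

```python
def lb_keogh(query, candidate, r):
    """Classical LB_Keogh lower bound on band-constrained DTW between
    equal-length sequences, with a Sakoe-Chiba band of half-width r.

    Query points falling outside the candidate's envelope contribute
    their squared distance to the envelope. The result never exceeds
    the true constrained-DTW cost, so candidates whose bound already
    beats the current best match can be discarded without running the
    full quadratic DTW computation.
    """
    total = 0.0
    n = len(candidate)
    for i, q in enumerate(query):
        window = candidate[max(0, i - r):min(n, i + r + 1)]
        lo, hi = min(window), max(window)
        if q > hi:
            total += (q - hi) ** 2
        elif q < lo:
            total += (q - lo) ** 2
    return total ** 0.5
```

An index stores cheap summaries from which such bounds can be computed, so that most of the database is pruned before any exact DTW is evaluated.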
Related papers
Time Series Mining for Smart Environments [Past]
The growing use of many low-level sensors instead of a few higher-level ones calls for dedicated pattern extraction methods. To this end, we have extended the existing T-patterns algorithm so that it scales efficiently to larger volumes of data.