I have become interested in something rather irrelevant to my PhD and started this blog because of it. How come?

Here, I will tell the personal story of how I came to my passion for data-driven dynamics and the training I have been acquiring.

## What is Data-Driven Dynamics?

Data-driven dynamics, from my perspective, is about distilling ‘physical laws’ purely from data. A friend once joked about its role in liberating physicists from ever having to derive equations again. To begin with, this tool has proved its power in recovering well-known physical laws, even in chaotic systems such as the double pendulum and the Lorenz attractor. It also scales to high-dimensional data when researchers harness the sparsity of natural laws.
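To make the sparsity idea concrete, here is a minimal sketch in the spirit of SINDy (sparse identification of nonlinear dynamics), recovering the Lorenz equations from simulated data. The polynomial library, the threshold value, and the use of exact derivatives are my own simplifying assumptions for the sketch, not any published implementation.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Lorenz system with the standard chaotic parameters
def lorenz(t, s, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    x, y, z = s
    return [sigma * (y - x), x * (rho - z) - y, x * y - beta * z]

t = np.linspace(0, 10, 2001)
sol = solve_ivp(lorenz, (0, 10), [1.0, 1.0, 1.0], t_eval=t, rtol=1e-10, atol=1e-10)
X = sol.y.T
dX = np.array([lorenz(0, s) for s in X])  # exact derivatives, to keep the sketch simple

# Library of candidate terms: polynomials up to degree two
def library(X):
    x, y, z = X.T
    return np.column_stack([np.ones_like(x), x, y, z,
                            x * x, x * y, x * z, y * y, y * z, z * z])

# Sequentially thresholded least squares: fit, zero out small
# coefficients, refit on the surviving terms, and repeat
def stlsq(Theta, dX, lam=0.1, iters=10):
    Xi = np.linalg.lstsq(Theta, dX, rcond=None)[0]
    for _ in range(iters):
        small = np.abs(Xi) < lam
        Xi[small] = 0.0
        for k in range(dX.shape[1]):
            big = ~small[:, k]
            Xi[big, k] = np.linalg.lstsq(Theta[:, big], dX[:, k], rcond=None)[0]
    return Xi

Xi = stlsq(library(X), dX)
# Each column of Xi now holds the sparse coefficients of one equation,
# e.g. the dx/dt column keeps only the x and y terms (-10 and +10).
```

Out of ten candidate terms per equation, the thresholding keeps only the two or three that actually appear in the Lorenz equations; that is the sparsity of natural laws at work.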

The more exciting aspect is discovering new laws, which would be particularly helpful in neuroscience and other biological systems, in climate science, and in economics, because we lack a fundamental understanding of these systems. For example, the algorithms might derive a ‘law’ of the brain by analysing readily available EEG data.

Although it is a cliché, I have to admit that I am motivated by how much this approach can contribute in the Big Data era. The field of dynamical systems and control is well established, and there is flourishing research in machine learning that processes data at massive scale and in novel ways. Data-driven dynamics could leverage knowledge from the conventional field and combine it with new machine learning tools to make sense of the vast data.

## How did I become interested?

It was a difficult time in my PhD: I had loads of data but was struggling to interpret them. Then I came to know the great tool of principal component analysis (PCA), called proper orthogonal decomposition (POD) in fluid dynamics. Out of the seemingly random data, beautiful patterns emerged once I applied PCA. It triggered a positive feedback loop of learning and applying. I then naturally became interested in dynamic mode decomposition (DMD), which combines PCA with the Fourier transform, another extremely powerful tool.
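To give a concrete picture of the PCA–Fourier connection, here is a minimal DMD sketch built on the SVD, following the exact-DMD formulation; the two-mode toy signal and the rank choice below are my own illustrative assumptions.

```python
import numpy as np

def dmd(X, r):
    """Exact DMD of a snapshot matrix X whose columns are snapshots in time.

    Returns the leading r DMD eigenvalues and spatial modes."""
    X1, X2 = X[:, :-1], X[:, 1:]
    # Rank-r POD (i.e. PCA) basis of the first snapshot matrix
    U, s, Vh = np.linalg.svd(X1, full_matrices=False)
    U, s, Vh = U[:, :r], s[:r], Vh[:r]
    # Project the one-step linear map A (with X2 ≈ A X1) onto the POD basis
    Atilde = U.conj().T @ X2 @ Vh.conj().T / s
    eigvals, W = np.linalg.eig(Atilde)
    # Exact DMD modes; each eigenvalue encodes a frequency and growth rate
    modes = (X2 @ Vh.conj().T / s) @ W
    return eigvals, modes

# Toy data: two spatial structures oscillating at frequencies 1.0 and 2.3
x = np.linspace(-5, 5, 128)
dt = 0.02
t = np.arange(0, 4, dt)
f = (np.exp(2.3j * t)[None, :] * (1 / np.cosh(x))[:, None]
     + np.exp(1.0j * t)[None, :] * (np.tanh(x) * np.exp(-x**2 / 4))[:, None])

eigvals, modes = dmd(f, r=2)
freqs = np.sort(np.log(eigvals).imag / dt)  # continuous-time frequencies
```

The SVD step is the PCA half of the story; taking the logarithm of the eigenvalues recovers the oscillation frequencies, which is where the Fourier flavour comes in.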

I could not fully understand the algorithm, so I started building my vocabulary through self-paced learning. After taking courses on linear dynamical systems and control, mechanical engineering analysis, and chaos theory, I became deeply intrigued by the elegant interpretations that linear algebra offers and their profound implications for engineering.

During this time, I tried using DMD and its variants on my data and gained a new, brilliant perspective on turbulence. Moreover, knowing the fundamentals helps me appreciate the bigger picture, especially where the tools I am learning would be useful.

## Getting serious?

I am exploring this research area, especially its major analysis tools, on my own. I try to balance searching, learning, implementation, and exploitation (e.g., writing this tech blog). The approach that has worked particularly well so far is:

- Search curiously.
- Narrow down an interesting topic, read and take notes.
- Pick a good paper and take good notes. Implement the algorithm on toy problems first and play with it.
- If possible, implement relevant techniques, compare and learn.

Most often, this process exposes holes in my knowledge, and the learning that follows is really rewarding, such as picking up the basics of nonlinear time-series analysis.

I had an interesting dataset to begin with and used this approach to learn (1) spectral POD, (2) the numerous variants of DMD, and (3) Koopman mode decomposition. Then I explored sparse sensing out of interest.
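Sparse sensing leans on the same sparsity idea from a different angle. As a hedged illustration (the dimensions, random seed, and Gaussian measurement matrix below are arbitrary choices of mine), here is a basis-pursuit sketch: recovering a sparse signal from far fewer measurements than its length by recasting the l1 minimisation as a linear programme.

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m, k = 100, 40, 3            # signal length, measurements, nonzeros
x_true = np.zeros(n)
support = rng.choice(n, size=k, replace=False)
x_true[support] = rng.normal(size=k)

A = rng.normal(size=(m, n)) / np.sqrt(m)  # random Gaussian measurements
y = A @ x_true                            # only m = 40 observations

# Basis pursuit: min ||x||_1 subject to A x = y.
# Split x into nonnegative parts x = u - v so the objective becomes linear.
c = np.ones(2 * n)
res = linprog(c, A_eq=np.hstack([A, -A]), b_eq=y, bounds=(0, None))
x_rec = res.x[:n] - res.x[n:]
```

With only 40 measurements of a length-100 signal, the three nonzero entries are recovered because the signal is sparse in the chosen basis.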

## Conclusion

I do not recommend this way of doing research, because one needs proper training during a PhD. However, I have benefitted immensely from the learning experience. The lessons I have learnt are:

- Take ownership of my research.
- Great work takes time and effort (and is risky), so it takes cycles.
- Going through those cycles calls for plans with deliverables (e.g., a report or a mini-project) to keep me focused and track progress.
- Build positive feedback loops.

I have been taught by amazing lecturers from around the world and met one of my idols because of this passion. I hope to share this exciting field with the public through my blog, and I appreciate any requests or contributions.

## Appendices

### Online courses

**Introduction to Linear Dynamical Systems** (EE263, Stanford), video lectures by Stephen Boyd

A great lecturer! The coursework is really useful, and being able to see the meaning of the linear algebra operations helped my understanding.

**Control Bootcamp** by Steve Brunton, University of Washington

An amazing lecturer with a brand-new style of lecturing! Really clear teaching and plenty of implementations to play with.

**Nonlinear Dynamics and Chaos** by Steven Strogatz, Cornell University

The lectures follow his book (beautifully written!), and I have mainly relied on the book. His teaching leads you slowly into the difficult concepts and explains things really well. I wish I had begun studying chaos, especially bifurcations, from this book!

**Mechanical Engineering Analysis** (ME564 and ME565) by Steve Brunton, University of Washington

Really rigorous and clear teaching on how to solve ODEs and PDEs. More relevant to engineers.

**Nonlinear Dynamics: Geometry of Chaos** by Predrag Cvitanović, Georgia Institute of Technology

The course accompanies his chaos book and is mathematically very challenging.

### Research groups

In my opinion, the Kutz group, the Brunton Lab, and their co-workers at the University of Washington are leading the frontier in this direction. Other researchers I have identified are Hayden Schaeffer at CMU and Maziar Raissi at Brown University.

### Future plan

The tools I am most familiar with are DMD and compressed sensing, so I will exploit these advantages and finish a few more mini-projects before applying for jobs. Other interesting topics found through ‘curious’ searching include:

- Using various machine learning techniques to model chaotic systems
- Predictive models based on chaos theory (e.g., the work of Chris Danforth)
- The intriguing synergy between dynamical systems and the work of Yutian Chen and Max Welling on herded Gibbs sampling