Commit 6c9de67

Clarify mathematical notation and add more SSPOC details

1 parent: f1493b5

File tree: 1 file changed (+5 / -5 lines)


docs/JOSS/paper.md

Lines changed: 5 additions & 5 deletions
```diff
@@ -25,7 +25,7 @@ affiliations:
    index: 3
  - name: Department of Biology, University of Washington
    index: 4
-date: 24 October 2020
+date: 13 February 2021
 bibliography: paper.bib
 ---
 
@@ -52,10 +52,10 @@ In addition to these two packages, researchers and practitioners have made avail
 Currently, researchers seeking to employ modern sensor placement methods must choose between implementing them from scratch or manually augmenting existing unpolished codes.
 
 Reconstruction and classification tasks often arise in the modeling, prediction, and control of complex processes in geophysics, fluid dynamics, biology, and manufacturing.
-The goal of reconstruction is to recover a high-dimensional signal $\mathbf{x}\in\mathbb{R}^N$ from a limited number of $p$ measurements $\mathbf{y}_i = \mathbf{c}_i^\top \mathbf{x}$, where each $\mathbf{c}_i$ represents the action of a sensor. `PySensors` optimizes a set of $p$ sensors out of $N$ candidates $\mathbf{c}_i^\top$ (rows of a measurement matrix $\mathbf{C}:\mathbf{y} = \mathbf{Cx}$) that minimize reconstruction error in a data-dependent basis $\mathbf{\Phi}\in\mathbb{R}^{N\times r}$
-$$ \mathbf{C}_\star = \underset{\mathbf{C}\in\mathbb{R}^{p\times N}}{\arg\min} \|\mathbf{x} - \mathbf{\Phi}(\mathbf{C\Phi})^{\dagger} \mathbf{y}\|_2^2, $$
-where $\dagger$ denotes the Moore-Penrose pseudoinverse. The key innovation is to recover the low-dimensional representation $\mathbf{x}_r: \mathbf{x} = \mathbf{\Phi x}_r$ via the reconstruction map $\mathbf{\Phi}(\mathbf{C\Phi})^{\dagger}$, ultimately reducing sensor placement to a highly efficient matrix pivoting operation [@manohar2018data]. Similarly, sensor placement for classification [@brunton2016sparse] optimizes the sparsest vector $\mathbf{s}_\star$ that reconstructs interclass decision boundaries $\mathbf{w}: \mathbf{\Phi}^\top\mathbf{s} = \mathbf{w}$ in the low-dimensional feature space.
-In this case, the optimal sensor locations are determined by the nonzero components of $\mathbf{s}_\star$.
+The goal of _reconstruction_ is to recover a high-dimensional signal $\mathbf{x}\in\mathbb{R}^N$ from a limited number of $p$ measurements $y_i = \mathbf{c}_i^\top \mathbf{x}$, where each $\mathbf{c}_i \in \mathbb{R}^N$ represents the action of a sensor. For example, $\mathbf{c}_i^\top = [1, 0, 0, \dots, 0]$ represents a sensor which takes a point measurement of the first dimension of the signal $\mathbf{x}$. `PySensors` selects a set of $p$ sensors out of $N$ candidates $\mathbf{c}_i^\top$ (rows of a measurement matrix $\mathbf{C}:\mathbf{y} = \mathbf{Cx}$) that minimize reconstruction error in a data-dependent basis $\mathbf{\Phi}\in\mathbb{R}^{N\times r}$
+$$ \mathbf{C}^\star = \underset{\mathbf{C}\in\mathbb{R}^{p\times N}}{\arg\min} \|\mathbf{x} - \mathbf{\Phi}(\mathbf{C\Phi})^{\dagger} \mathbf{y}\|_2^2, $$
+where $\dagger$ denotes the Moore-Penrose pseudoinverse. The key innovation is to recover the low-dimensional representation $\mathbf{x}_r \in \mathbb{R}^r$ satisfying $\mathbf{x} = \mathbf{\Phi x}_r$ via the reconstruction map $\mathbf{\Phi}(\mathbf{C\Phi})^{\dagger}$, ultimately reducing sensor placement to a highly efficient matrix pivoting operation [@manohar2018data]. Similarly, sensor placement for _classification_ [@brunton2016sparse] optimizes the sparsest vector $\mathbf{s}^\star$ that reconstructs $\mathbf{w}: \mathbf{\Phi}^\dagger\mathbf{s} = \mathbf{w}$ in the low-dimensional feature space, where $\mathbf{w}$ is the set of weights learned by a linear classifier fit to $\mathbf{x}_r$.
+In this case, the optimal sensor locations are determined by the nonzero components of $\mathbf{s}^\star$.
 
 The basis $\mathbf{\Phi}$ is explicitly computed from the data using powerful dimensionality reduction techniques such as principal components analysis (PCA) and random projections, which enable significant compression of most signals to $r\ll N$ dimensions. PCA extracts the dominant spatial correlations or _principal components_, the leading eigenvectors of the data covariance matrix. It is computed using the matrix singular value decomposition (SVD) and is closely related to proper orthogonal decomposition (POD); POD modes and principal components are equivalent.
 Other basis choices are possible, such as dynamic mode decomposition for extracting temporally correlated features [@manohar2019optimized].
```
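The reconstruction pipeline described in the changed paragraph can be sketched in plain NumPy/SciPy, independent of the PySensors API (all variable names here are illustrative): compute a PCA/POD basis $\mathbf{\Phi}$ from data via the SVD, choose $p$ sensor locations by QR with column pivoting on $\mathbf{\Phi}^\top$ (the pivoting step referenced in [@manohar2018data]), and reconstruct via the map $\mathbf{\Phi}(\mathbf{C\Phi})^{\dagger}$:

```python
import numpy as np
from scipy.linalg import qr

rng = np.random.default_rng(0)

# Synthetic training data: n_samples snapshots of an N-dimensional, rank-r signal
N, n_samples, r, p = 64, 200, 5, 5
modes = rng.standard_normal((N, r))
X = rng.standard_normal((n_samples, r)) @ modes.T

# Data-dependent basis Phi: leading r left singular vectors (PCA / POD modes)
U, _, _ = np.linalg.svd(X.T, full_matrices=False)
Phi = U[:, :r]                                  # shape (N, r)

# QR with column pivoting on Phi^T: the first p pivots are the sensor indices
_, _, piv = qr(Phi.T, pivoting=True)
sensors = piv[:p]

# Measurement matrix C selects the chosen rows of the identity
C = np.eye(N)[sensors]

# Reconstruct a new signal from its p point measurements: x_hat = Phi (C Phi)^+ y
x = modes @ rng.standard_normal(r)
y = C @ x
x_hat = Phi @ np.linalg.pinv(C @ Phi) @ y

# Recovery is exact (to round-off) because x lies in the span of Phi
print(np.allclose(x, x_hat))
```

For signals with energy outside the span of $\mathbf{\Phi}$, the reconstruction is instead the least-squares optimal estimate in that basis, which is exactly the error the $\mathbf{C}^\star$ objective above minimizes.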

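The classification (SSPOC-style) problem in the diff — finding the sparsest $\mathbf{s}^\star$ whose nonzero entries give the sensor locations — can likewise be sketched as a basis-pursuit linear program. This is a hedged illustration of the underlying $\ell_1$ relaxation, not the PySensors implementation; the basis, weights, and tolerance below are made up for the example:

```python
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(1)
N, r = 30, 4

# Stand-in orthonormal feature basis Phi and classifier weights w
# learned in the r-dimensional feature space (both synthetic here)
Phi, _ = np.linalg.qr(rng.standard_normal((N, r)))
w = rng.standard_normal(r)

# Basis pursuit:  min ||s||_1  s.t.  Phi^T s = w,
# written as an LP by splitting s = u - v with u, v >= 0
A = Phi.T                                       # shape (r, N)
res = linprog(c=np.ones(2 * N),
              A_eq=np.hstack([A, -A]),
              b_eq=w,
              bounds=(0, None),
              method="highs")
s = res.x[:N] - res.x[N:]

# Optimal sensor locations = support of the sparse solution s*
sensors = np.flatnonzero(np.abs(s) > 1e-8)      # tolerance chosen for this example
print("selected sensors:", sensors)
```

A vertex solution of this LP has at most $r$ nonzero entries, so the number of selected sensors is on the order of the feature dimension rather than the ambient dimension $N$.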