PyNumDiff is a Python package that implements various methods for computing numerical derivatives of noisy data, which can be a critical step in developing dynamic models or designing control. There are seven different families of methods implemented in this repository:

1. convolutional smoothing followed by finite difference calculation
2. polynomial-fit-based methods
3. iterated finite differencing
4. total variation regularization of a finite difference derivative
5. Kalman (RTS) smoothing
6. Fourier spectral methods
7. local approximation with linear models

Most of these methods have multiple parameters, so we take a principled approach and propose a multi-objective optimization framework for choosing parameters that minimize a loss function to balance the faithfulness and smoothness of the derivative estimate. For more details, refer to [this paper](https://doi.org/10.1109/ACCESS.2020.3034077).
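As a concrete illustration of the first family, here is a minimal, dependency-free sketch of smoothing followed by a finite difference calculation. The helper names `moving_average` and `finite_difference` are illustrative, not PyNumDiff's implementation:

```python
def moving_average(x, window=3):
    """Centered moving-average smoother; endpoints are left unsmoothed."""
    half = window // 2
    out = list(x)
    for i in range(half, len(x) - half):
        out[i] = sum(x[i - half:i + half + 1]) / window
    return out

def finite_difference(x, dt):
    """Central differences in the interior, one-sided at the ends."""
    n = len(x)
    dxdt = [(x[1] - x[0]) / dt]  # forward difference at the left end
    for i in range(1, n - 1):
        dxdt.append((x[i + 1] - x[i - 1]) / (2 * dt))
    dxdt.append((x[-1] - x[-2]) / dt)  # backward difference at the right end
    return dxdt

# Smooth first, then differentiate: the noise is attenuated before
# differencing gets a chance to amplify it.
dt = 0.1
t = [i * dt for i in range(21)]
x = [ti ** 2 for ti in t]  # noiseless parabola for a sanity check
dxdt_hat = finite_difference(moving_average(x), dt)
```

For `x = t**2` the smoothed central differences recover `dx/dt = 2t` at interior points, because a centered moving average only adds a constant offset to a quadratic.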
## Installing
Dependencies are listed in [pyproject.toml](https://github.com/florisvb/PyNumDiff/blob/master/pyproject.toml). They include the usual suspects like `numpy` and `scipy`, but also optionally `cvxpy`.
The code is compatible with Python >= 3.10. Install from PyPI with `pip install pynumdiff`, from source with `pip install git+https://github.com/florisvb/PyNumDiff`, or from a local download with `pip install .`. Call `pip install pynumdiff[advanced]` to automatically install optional dependencies from the advanced list, like [CVXPY](https://www.cvxpy.org).
## Usage
For more details, read our [Sphinx documentation](https://pynumdiff.readthedocs.io/master/). The basic pattern of all differentiation methods is:

```python
x_hat, dxdt_hat = somethingdiff(x, dt, **kwargs)
```

where `x` is data, `dt` is a step size, and various keyword arguments control the behavior. Some methods support variable step size, in which case `dt` can also receive an array of values to denote sample locations.
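When `dt` is an array of sample locations, the spacing between neighbors varies. As a sketch of the underlying idea (not PyNumDiff's internals), a first-order estimate on a nonuniform grid simply divides by the actual spacing; `finite_difference_nonuniform` is a hypothetical helper name:

```python
def finite_difference_nonuniform(x, t):
    """First-order derivative estimate at arbitrary sample locations t:
    one-sided at the ends, wide central differences in the interior."""
    n = len(x)
    dxdt = [(x[1] - x[0]) / (t[1] - t[0])]
    for i in range(1, n - 1):
        dxdt.append((x[i + 1] - x[i - 1]) / (t[i + 1] - t[i - 1]))
    dxdt.append((x[-1] - x[-2]) / (t[-1] - t[-2]))
    return dxdt

t = [0.0, 0.1, 0.25, 0.5, 0.8, 1.0]  # irregular sample locations
x = [3 * ti for ti in t]             # linear signal, so dx/dt = 3
dxdt_hat = finite_difference_nonuniform(x, t)
```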
If no `search_space_updates` is given to the parameter optimizer, a default search space is used; see the top of `_optimize.py` for the defaults.
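The package's own optimizer lives in `_optimize.py` and its exact signature may differ, but the core idea of scanning a search space given as a dict of candidate parameter values can be sketched generically. `grid_search` here is a hypothetical stand-in, not the library's API:

```python
from itertools import product

def grid_search(loss, search_space):
    """Evaluate loss(**params) over the Cartesian product of a search
    space like {'param1': [...], 'param2': [...]} and return the best
    parameter dict together with its loss value."""
    names = list(search_space)
    best_params, best_val = None, float("inf")
    for combo in product(*(search_space[name] for name in names)):
        params = dict(zip(names, combo))
        val = loss(**params)
        if val < best_val:
            best_params, best_val = params, val
    return best_params, best_val

# Toy loss with a known minimum at window=5, order=2
best, val = grid_search(
    lambda window, order: (window - 5) ** 2 + (order - 2) ** 2,
    {"window": [3, 5, 7], "order": [1, 2, 3]},
)
```

In practice the loss would score a candidate derivative estimate for faithfulness and smoothness, as described above.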
The following heuristic works well for choosing `tvgamma`, where `cutoff_frequency` is the highest frequency content of the signal in your data and `dt` is the timestep: `tvgamma=np.exp(-1.6*np.log(cutoff_frequency)-0.71*np.log(dt)-5.1)`. Larger values of `tvgamma` produce smoother derivatives. The value of `tvgamma` is largely universal across methods, making it easy to compare method results. Be aware that the optimization is a fairly heavy computation.
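The heuristic is just a log-linear formula, so it is easy to evaluate with the standard library alone; `tvgamma_heuristic` is an illustrative name:

```python
import math

def tvgamma_heuristic(cutoff_frequency, dt):
    """tvgamma = exp(-1.6*ln(f_c) - 0.71*ln(dt) - 5.1),
    the log-linear fit quoted above."""
    return math.exp(-1.6 * math.log(cutoff_frequency)
                    - 0.71 * math.log(dt) - 5.1)

gamma = tvgamma_heuristic(cutoff_frequency=2.0, dt=0.01)
```

Note the signs: a higher cutoff frequency (the signal has fast content worth preserving) gives a smaller `tvgamma`, i.e. less smoothing, while a smaller timestep gives a larger `tvgamma`.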
### Notebook examples
Much more extensive usage is demonstrated in Jupyter notebooks:
* Differentiation with different methods: [1_basic_tutorial.ipynb](https://github.com/florisvb/PyNumDiff/blob/master/examples/1_basic_tutorial.ipynb)
* Parameter optimization with known ground truth (for demonstration purposes only): [2a_optimizing_parameters_with_dxdt_known.ipynb](https://github.com/florisvb/PyNumDiff/blob/master/examples/2a_optimizing_parameters_with_dxdt_known.ipynb)
* Parameter optimization with unknown ground truth: [2b_optimizing_parameters_with_dxdt_unknown.ipynb](https://github.com/florisvb/PyNumDiff/blob/master/examples/2b_optimizing_parameters_with_dxdt_unknown.ipynb)
## Citation

See the CITATION.cff file as well as the following references.

### Software:

```
@article{PyNumDiff2022,
    title={PyNumDiff: A Python package for numerical differentiation of noisy time-series data},
    author={Van Breugel, Floris and Liu, Yuying and Brunton, Bingni W. and Kutz, J. Nathan},
    journal={Journal of Open Source Software},
    volume={7},
    number={71},
    pages={4078},
    year={2022}
}
```

### Optimization algorithm:

```
@article{ParamOptimizationDerivatives2020,
    title={Numerical differentiation of noisy data: A unifying multi-objective optimization framework},
    author={Van Breugel, Floris and Kutz, J. Nathan and Brunton, Bingni W.},
    journal={IEEE Access},
    volume={8},
    year={2020}
}
```
## Running the tests
We use GitHub Actions for continuous integration testing.
Run tests locally by navigating to the repo in a terminal and calling
```bash
> pytest -s
```
Add the flag `--plot` to see plots of the methods against test functions. Add the flag `--bounds` to print $\log$ error bounds (useful when changing method behavior).