Implementation of physics-informed generator learning using kernel methods #10
DevergneTimothee wants to merge 8 commits into main
Conversation
pietronvll
left a comment
Thanks for the PR @DevergneTimothee! In addition to the comments about naming conventions and docs issues, I think we should create a submodule focused solely on dynamics, where @vladi-iit can add all the other code he wrote for the Laplace approach and for the transfer operator.
Specifically, we can add a dynamics subfolder inside the kernel module, containing regressors, utils, and structs (if needed) specifically for dynamics.
Following this, we can update the docs so that the sidebar has an API link pointing directly to kernel/dynamics.
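A sketch of the layout this suggests (only `dynamics`, `regressors`, `utils`, and `structs` are named in the comment; the `.py` file split is just one plausible reading):

```
linear_operator_learning/
└── kernel/
    └── dynamics/
        ├── regressors.py
        ├── utils.py
        └── structs.py
```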
`def eig_physics_informed(`
Why do we need a specific function to compute the eigendecomposition? Check with @vladi-iit if you can write a single function to compute the spectral decomposition of dynamical operators (generators, transfer operators, koopman)
The shape of the kernel matrix is different, but this is resolved when using a block matrix, so it can be unified. In the future I might add a utils function that computes this block matrix.
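The block-matrix idea can be sketched as follows (a minimal illustration with hypothetical names and shapes, not the PR's actual code): stacking the differently shaped kernel blocks into one square matrix lets a single eigensolver routine serve generators, transfer operators, and Koopman operators alike.

```python
import numpy as np

def eig_block_operator(K_top: np.ndarray, K_bottom: np.ndarray):
    """Spectral decomposition of a dynamical operator assembled from two
    rectangular kernel blocks. Hypothetical helper sketching the
    block-matrix unification; names and shapes are illustrative."""
    # Stack the two (n, 2n) blocks vertically into a (2n, 2n) square matrix.
    K = np.block([[K_top], [K_bottom]])
    assert K.shape[0] == K.shape[1], "assembled block matrix must be square"
    evals, evecs = np.linalg.eig(K)
    # Sort by decreasing real part (the usual convention for generator spectra).
    order = np.argsort(-evals.real)
    return evals[order], evecs[:, order]
```

With this in place, the operator-specific functions only differ in how they build the two blocks, not in how they solve the eigenproblem.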
`def evaluate_right_eigenfunction_physics_informed(`
Same comment as for the eig function above.
Everything is now unified; I also started adding @vladi-iit's code into the branch for future Laplace integration.
`def reduced_rank_regression_physics_informed(`
Here and in the other functions you've implemented, I don't like the suffix "physics_informed" as it is too vague. What about dirichlet_reduced_rank?
I agree, I changed it to Dirichlet
```python
kernel_X: np.ndarray,
X: np.ndarray,
sigma: float,
friction: np.ndarray,
```
The friction argument here is strange, as it is not a parameter of the kernel. Can't we remove it from here and add it back as an argument of the regressor?
`def return_phi_dphi(`
Instead of having this function in utils, I propose to do the following:

- Create a `kernels.py` file
- Add these functions as methods of a subclass of sklearn's Gaussian kernel, like this:
- Slightly change the names for clarity

```python
# linear_operator_learning/kernel/dynamics/kernels.py
from sklearn.gaussian_process.kernels import RBF

class RBF_with_grad(RBF):
    def __init__(...):
        # Init code here
    def grad(X, ...):
        # Code from return_phi_dphi
    def grad2(X, ...):
        # Code from return_dphi_dphi
```
Implemented this, but I left friction as a parameter as it is a bit more complex than just a multiplying factor
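A runnable sketch of the subclassing idea above (the class name and `grad` signature are assumptions, and friction is deliberately left out here, matching the reply): an RBF subclass exposing the analytic gradient of the kernel with respect to its first argument.

```python
import numpy as np
from sklearn.gaussian_process.kernels import RBF

class RBFWithGrad(RBF):
    """Gaussian RBF kernel extended with an analytic gradient.
    Illustrative sketch; the method name and signature are assumptions."""

    def grad(self, X: np.ndarray, Y: np.ndarray) -> np.ndarray:
        """d k(x_i, y_j) / d x_i for the RBF kernel
        k(x, y) = exp(-||x - y||^2 / (2 * length_scale^2)).
        Returns an array of shape (n, m, d)."""
        K = self(X, Y)                        # (n, m) kernel matrix via sklearn's RBF
        diff = X[:, None, :] - Y[None, :, :]  # (n, m, d) pairwise differences x_i - y_j
        return -diff / self.length_scale**2 * K[:, :, None]
```

Keeping the gradient next to the kernel (rather than in a free-standing utils function) means `length_scale` is read from the same object that computed the kernel matrix, so the two can never get out of sync.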
…r_learning into physics_informed
Added Kernels class
I updated the docs as well. Let me know if you see anything or want me to change something.