
Implementation of the physics informed generator learning using kernel methods #10

Open
DevergneTimothee wants to merge 8 commits into main from physics_informed

Conversation

@DevergneTimothee

  1. Reduced rank regression (reduced_rank_regression_physics_informed in kernels/regressors.py)
  2. Eigenfunction utilities (evaluate_right_eigenfunctions_physics_informed in kernels/linalg.py; the left eigenfunction is the same as for the other methods, and implementing both in one function seemed messy)
  3. Prediction (predict_physics_informed in kernels/regressors.py)
  4. Derivative utilities, currently only for Gaussian kernels, in utils.py

I have also updated the docs. Let me know if you spot anything or want me to change something.
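For context, kernel reduced rank regression ultimately boils down to a truncated generalized eigenvalue problem. Below is a deliberately simplified sketch of that step only (generic Tikhonov regularization, illustrative names and normalization; this is not the PR's physics-informed estimator, which handles the generator terms differently):

```python
import numpy as np
from scipy.linalg import eigh

def reduced_rank_step(K_X, K_Y, rank, reg=1e-6):
    """Toy reduced-rank step: keep the top-`rank` generalized eigenpairs of
    (1/n) K_Y w = sigma (K_X + n*reg*I) w. Real estimators differ in how the
    two Gram matrices enter, but they share this truncation pattern."""
    n = K_X.shape[0]
    B = K_X + n * reg * np.eye(n)     # Tikhonov-regularized input Gram matrix
    sigmas, W = eigh(K_Y / n, B)      # symmetric generalized eigenproblem
    idx = np.argsort(sigmas)[::-1][:rank]
    return sigmas[idx], W[:, idx]     # leading values and coefficient vectors
```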

Contributor

@pietronvll pietronvll left a comment

Thanks for the PR @DevergneTimothee! In addition to the comments about naming conventions and docs issues, I think we should create a submodule focused solely on dynamics, where @vladi-iit can add all the other code he wrote for the Laplace approach and for the transfer operator.

Specifically, we can add a dynamics subfolder inside the kernel module, where you can have regressors, utils, and structs (if needed) specifically for dynamics.

Following this, we can update the docs to add an API link pointing directly to kernel/dynamics in the sidebar.

return result


def eig_physics_informed(
Contributor

Why do we need a specific function to compute the eigendecomposition? Check with @vladi-iit whether you can write a single function to compute the spectral decomposition of dynamical operators (generators, transfer operators, Koopman operators).
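In case it helps the unification discussion: kernel estimators of generators, transfer operators, and Koopman operators can all in principle funnel into one generalized eigensolver once the right matrix pair is built. A minimal sketch (the function name and sorting convention are made up for illustration):

```python
import numpy as np
from scipy.linalg import eig

def eig_dynamical_operator(A, B):
    """Hypothetical unified spectral decomposition: solve A v = lam B v.
    Different dynamical operators would only differ in how A and B are
    assembled from the kernel (and derivative) Gram matrices."""
    vals, vecs = eig(A, B)
    order = np.argsort(-vals.real)  # sort by decreasing real part
    return vals[order], vecs[:, order]
```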

Author

The shape of the kernel matrix is different, but this is resolved by using a block matrix, so the two can be unified. In the future I might add a utils function that builds this block matrix.
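A sketch of the block-matrix assembly alluded to above (the function name and block shapes are illustrative assumptions: K is the plain Gram matrix, dK the kernel/derivative cross block, d2K the derivative/derivative block):

```python
import numpy as np

def block_kernel_matrix(K, dK, d2K):
    """Assemble the symmetric block Gram matrix combining kernel values and
    kernel derivatives, so that generator and transfer-operator code can
    share one eigendecomposition routine.
    Shapes: K is (n, n), dK is (n, n*d), d2K is (n*d, n*d)."""
    return np.block([[K,    dK],
                     [dK.T, d2K]])
```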

return np.linalg.multi_dot([rsqrt_dim * K_Xin_X_or_Y, vr_or_vl])


def evaluate_right_eigenfunction_physics_informed(
Contributor

Same comment as for the eig function above.

Author

Everything is now unified. I have also started adding @vladi-iit's code to the branch for the future Laplace integration.

return result


def reduced_rank_regression_physics_informed(
Contributor

Here and in the other functions you've implemented, I don't like the suffix "physics_informed" as it is too vague. What about dirichlet_reduced_rank?

Author

I agree; I changed it to Dirichlet.

kernel_X: np.ndarray,
X: np.ndarray,
sigma: float,
friction: np.ndarray,
Contributor

The friction argument here is strange, as it is not a parameter of the kernel. Can't we remove it from here and add it back as an argument of the regressor?

return (int(i), int(j))


def return_phi_dphi(
Contributor

Instead of having this function in utils, I propose the following:

  1. Create a kernels.py file
  2. Add these functions as methods of a subclass of sklearn's Gaussian kernel, like this:
  3. Slightly change the names for clarity

# linear_operator_learning/kernel/dynamics/kernels.py

from sklearn.gaussian_process.kernels import RBF

class RBF_with_grad(RBF):
    def __init__(self, ...):
        # Init code here
    def grad(self, X, ...):
        # Code from return_phi_dphi
    def grad2(self, X, ...):
        # Code from return_dphi_dphi

Author

I implemented this, but left friction as a parameter, since it is a bit more complex than just a multiplicative factor.
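For reference, a runnable sketch of what such a subclass could look like, with the analytic first derivative of the Gaussian kernel filled in. The class and method names follow the proposal above, but the `grad` body is my own illustration, not the PR's code:

```python
import numpy as np
from sklearn.gaussian_process.kernels import RBF

class RBFWithGrad(RBF):
    """RBF kernel that also exposes its analytic gradient w.r.t. the first
    argument: for k(x, y) = exp(-||x - y||^2 / (2 l^2)),
    dk/dx = -(x - y) / l^2 * k(x, y)."""

    def grad(self, X, Y):
        K = self(X, Y)                        # (n, m) Gram matrix
        diff = X[:, None, :] - Y[None, :, :]  # (n, m, d) pairwise differences
        return -diff / self.length_scale**2 * K[:, :, None]
```

A quick way to sanity-check such a method is to compare it against central finite differences of the kernel values.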
