## Merging into `master`

PR https://github.com/gridap/GridapDistributed.jl/pull/50

- [x] Update README
- [x] Remove src/OLD, test/OLD, compile
- [x] Add NEWS.md

## High priority

- [x] Release Gridap 0.17.0
- [x] Release PartitionedArrays 0.2.3
- [x] Write data into `pvtu` format (@amartinhuertas). PRs https://github.com/gridap/WriteVTK.jl/pull/1 and https://github.com/gridap/GridapDistributed.jl/pull/46 (see the first sketch in the Sketches section below)
- [x] Create MPI launchers for the tests (currently we only test in serial mode). @fverdugo https://github.com/gridap/GridapDistributed.jl/pull/45
- [x] Assembly of the rhs vector alone (@amartinhuertas). PR https://github.com/gridap/GridapDistributed.jl/pull/40
- [x] Solve broken tests in `FESpacesTests.jl`; namely, the `assemble_XXX!` functions do not work. @fverdugo https://github.com/gridap/GridapDistributed.jl/pull/44
- [x] Implement an assembly strategy for when one also iterates over ghost cells.
- [x] Interpolation onto distributed multi-field spaces. @fverdugo PR https://github.com/gridap/GridapDistributed.jl/pull/42
- [x] Port the uniformly refined forest-of-octrees distributed model to this new version (in [GridapP4est](https://github.com/gridap/GridapP4est.jl)). @amartinhuertas
- [x] Add test drivers including DG examples. @fverdugo PR https://github.com/gridap/GridapDistributed.jl/pull/42
- [x] Add test drivers including nonlinear examples. @fverdugo PR https://github.com/gridap/GridapDistributed.jl/pull/44
- [x] Do not use `MPI.COMM_WORLD` in `get_part_ids(b::MPIBackend, ...)` (https://github.com/fverdugo/PartitionedArrays.jl/issues/33); solved by duplicating the communicator in the MPI backend in https://github.com/fverdugo/PartitionedArrays.jl/pull/35
- [x] `@assert`s triggered locally on a processor may lead to deadlock! We should call `mpi_abort` whenever an `AssertionError` exception is triggered (see the second sketch in the Sketches section below). PR https://github.com/fverdugo/PartitionedArrays.jl/pull/39
- [ ] AD in parallel. Talk to @amartinhuertas to grasp the challenges there.

## Medium priority

- [x] Nicer user API for PETScSolver (to be done in GridapPETSc). In particular, better handling of the lifetime of PETSc objects.
- [x] Implement distributed models from gmsh (in GridapGmsh). @fverdugo
- [x] Implement `lu!` for `PSparseMatrix` following the same strategy as with `\`.
- [x] Move MPI tests to their own CI job. @amartinhuertas PR https://github.com/gridap/GridapDistributed.jl/pull/47
- [x] Automatically generate a sysimage to reduce MPI test runtime. @amartinhuertas PR https://github.com/gridap/GridapDistributed.jl/pull/48
- [ ] Think about how the user can define the local and global vector types in the FESpace.
- [ ] Showing the `NLsolve.jl` solver trace only on the master MPI rank leads to a deadlock.
- [ ] Periodic BCs.
- [ ] Implement ZeroMeanFESpace.

## Low priority

- [ ] Implement a lazier initialization of the matrix exchanger in PartitionedArrays, since the exchanger is not always needed.
- [x] Overlap the compression of the sparse matrix with the communication of the rhs assembly.
- [ ] Implement another strategy to represent local matrices in PartitionedArrays with a data layout compatible with PETSc.
- [x] `interpolate_everywhere` not available in `GridapDistributed.jl`, only `interpolate`. Solved in PR https://github.com/gridap/GridapDistributed.jl/pull/74.
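## Sketches

First, a minimal sketch of how distributed results end up in `pvtu` files, assuming the PartitionedArrays-based user API (`prun` with the `mpi` backend and a `CartesianDiscreteModel` built from the part ids); the partition, domain, cell counts, and output name are illustrative only, not part of the PRs linked above.

```julia
using Gridap
using GridapDistributed
using PartitionedArrays

partition = (2, 2)            # 2x2 grid of parts (MPI ranks)
prun(mpi, partition) do parts
  domain = (0, 1, 0, 1)       # unit square
  cells = (8, 8)              # global number of cells per direction
  model = CartesianDiscreteModel(parts, domain, cells)
  Ω = Triangulation(model)
  # Parallel VTK output: a single .pvtu index file plus one .vtu piece per part
  writevtk(Ω, "demo_trian")
end
```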
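Second, a sketch of the `mpi_abort`-on-`AssertionError` idea written directly on top of MPI.jl, rather than through the `prun`/`prun_debug` machinery of the PR linked above; `with_mpi_abort_on_assert` is a hypothetical helper name introduced here for illustration.

```julia
using MPI

# If one rank hits an AssertionError while the others wait in a collective
# call, the job hangs. Aborting the whole communicator turns that silent
# deadlock into a clean failure.
function with_mpi_abort_on_assert(driver)  # hypothetical helper, not the PartitionedArrays API
  try
    driver()
  catch e
    if e isa AssertionError
      rank = MPI.Comm_rank(MPI.COMM_WORLD)
      @error "AssertionError on rank $rank; aborting the MPI job" exception = e
      MPI.Abort(MPI.COMM_WORLD, 1)  # terminates every rank, not just this one
    end
    rethrow()
  end
end

MPI.Init()
with_mpi_abort_on_assert() do
  # ... local driver code; any failing @assert here brings down all ranks ...
  @assert true
end
MPI.Finalize()
```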