fix(mode): fixed bug where mode solver was not respecting location of PEC #3030
Conversation
3 files reviewed, 1 comment
Diff Coverage
Diff: origin/develop...HEAD, staged and unstaged changes

Summary

tidy3d/components/mode/mode_solver.py, lines 1676-1684:

```python
) -> tuple[list[float], list[dict[str, ArrayComplex4D]], list[EpsSpecType]]:
    """Call the mode solver at all requested frequencies."""
    if tidy3d_extras["use_local_subpixel"]:
        subpixel_ms = tidy3d_extras["mod"].SubpixelModeSolver.from_mode_solver(self)
        return subpixel_ms._solve_all_freqs(
            coords=coords,
            symmetry=symmetry,
        )
```

tidy3d/components/mode/solver.py, lines 162-174:

```python
eps_cross = cls._truncate_medium_data(eps_cross, cell_min, cell_max)

# Truncate mu_cross if provided
if mu_cross is not None:
    mu_cross = cls._truncate_medium_data(mu_cross, cell_min, cell_max)

# Truncate split_curl_scaling if provided
if split_curl_scaling is not None:
    split_curl_scaling = tuple(
        s[cell_min[0] : cell_max[0], cell_min[1] : cell_max[1]]
        for s in split_curl_scaling
    )
```

tidy3d/components/mode/solver.py, lines 1132-1144:

```python
if isinstance(mat_data, Numpy):
    # Tensorial format: shape (9, Nx, Ny)
    return mat_data[:, x_slice, y_slice]
if len(mat_data) == 9:
    # Nine separate 2D arrays
    return tuple(arr[x_slice, y_slice] for arr in mat_data)

raise ValueError("Wrong input to mode solver permittivity/permeability truncation!")

@staticmethod
def format_medium_data(mat_data):
    """
```
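For context, here is a standalone sketch of the truncation that `_truncate_medium_data` performs in the snippet above, assuming the medium data arrives either as a single `(9, Nx, Ny)` array or as a tuple of nine 2D component arrays (this is a simplified re-implementation for illustration, not the PR's code):

```python
import numpy as np

def truncate_medium_data(mat_data, cell_min, cell_max):
    """Slice permittivity/permeability data down to the truncated mode-solver grid."""
    x_slice = slice(cell_min[0], cell_max[0])
    y_slice = slice(cell_min[1], cell_max[1])
    if isinstance(mat_data, np.ndarray):
        # tensorial format: shape (9, Nx, Ny)
        return mat_data[:, x_slice, y_slice]
    if len(mat_data) == 9:
        # nine separate 2D component arrays
        return tuple(arr[x_slice, y_slice] for arr in mat_data)
    raise ValueError("Unsupported medium data format.")

# 9-component permittivity on a 10x8 grid, truncated by one cell at each boundary
eps_tensor = np.ones((9, 10, 8))
eps_trunc = truncate_medium_data(eps_tensor, cell_min=(1, 1), cell_max=(9, 7))
print(eps_trunc.shape)  # (9, 8, 6)
```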
@greptileai another round
4 files reviewed, 1 comment
… PEC boundary conditions from the Simulation
```python
            f"Fields will not wrap around periodically."
        )
    # ABC boundaries: no truncation, no warning (fields can extend)
    return (tuple(pos_min), tuple(pos_max))
```
So this fix seems to be specific to the case where the enclosing Simulation has e.g. PEC boundaries.
I am wondering if this still excludes problematic cases where the enclosing Simulation is large, but the mode plane itself does not impose the PEC boundaries on the analytical plane boundaries, but instead on whatever the discretized version of the plane turns out to be. You guys might have dealt with something like that already, not sure.
If memory serves me right, at least in the new ModeSimulation class we've made a small step towards handling this correctly in the following sense:
- If the `ModeSimulation` is defined with a 3D `center`, `size`, as well as a `plane` argument, then the inaccuracy is technically there. The plane will be discretized based on the larger 3D "simulation" grid, and the discretization, and hence where boundaries are applied, may not match the analytical plane definition.
- If however the `ModeSimulation` is defined with a 2D `size` and no extra `plane` argument, then the grid matches the plane, and the boundaries match the expected location.
So I guess what I'm wondering is if your approach to pass the `sim_pec_pos_max`, etc. should also be applied in situations like case 1 above. Although I'm not sure what should happen if the analytical plane position just lies somewhere in between the mode plane grid points... Here your fix works because the boundary location does match a grid point, there just might be an extra pixel, right?
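To make case 1 concrete, here is a rough numpy-only sketch (the grid and plane numbers are made up, and this is not code from the PR), assuming the discretization snaps the plane to the enclosing grid cells of the larger 3D grid, so boundaries end up slightly outside the analytical plane bounds:

```python
import numpy as np

# hypothetical 1D grid boundaries of the enclosing 3D simulation along y (um), pitch 0.2
y_bounds = np.linspace(-2.0, 2.0, 21)

# analytical mode-plane extent along y
plane_ymin, plane_ymax = -1.55, 1.55

# the plane is discretized on the 3D grid: keep all cells that overlap the plane
i_min = int(np.searchsorted(y_bounds, plane_ymin, side="right") - 1)
i_max = int(np.searchsorted(y_bounds, plane_ymax, side="left"))
print(y_bounds[i_min], y_bounds[i_max])  # ~ -1.6 and 1.6: boundaries applied outside the plane
```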
There was a problem hiding this comment.
Choose a reason for hiding this comment
The reason will be displayed to describe this comment to others. Learn more.
> I am wondering if this still excludes problematic cases where the enclosing Simulation is large, but the mode plane itself does not impose the PEC boundaries on the analytical plane boundaries, but instead on whatever the discretized version of the plane turns out to be. You guys might have dealt with something like that already, not sure.
Yes, so far I was trying to match existing behavior, but fix the bug. So if the mode plane is smaller than the Simulation plane bounds, there is no change in behavior when compared to develop. It is not clear to me what to do if the mode plane boundaries don't line up with grid boundaries. I think changing the grid is never a good idea, especially since these modes might be injected into an FDTD simulation. And if we choose the closest grid boundary, or say the closest grid boundary that encompasses the mode plane, then basically we have the same problem (adding additional space), just potentially not quite as bad as the one complete cell in the bug.
> Here your fix works because the boundary location does match a grid point, there just might be an extra pixel, right?

Correct, `sim_pec_pos_max` might as well be an index into the grid boundaries. I only want to support the PEC boundary matching an existing grid boundary, because otherwise the mode solver grid would have to change slightly to respect a new location of the PEC boundary.
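As a rough illustration of that constraint (pure numpy; `pec_position_to_index` is a hypothetical helper, not part of the PR), the PEC position is only accepted if it coincides with an existing grid boundary, and is then converted into a truncation index:

```python
import numpy as np

def pec_position_to_index(grid_bounds, pec_pos, atol=1e-12):
    """Map a PEC boundary position onto an existing grid-boundary index.

    Raises if the position does not coincide with any grid boundary, since moving
    the mode solver grid to accommodate it is not supported.
    """
    matches = np.flatnonzero(np.isclose(grid_bounds, pec_pos, rtol=0.0, atol=atol))
    if matches.size == 0:
        raise ValueError("PEC boundary does not coincide with a grid boundary.")
    return int(matches[0])

x_bounds = np.array([0.0, 0.1, 0.2, 0.3, 0.4])
sim_pec_pos_max = 0.3
idx = pec_position_to_index(x_bounds, sim_pec_pos_max)
x_bounds_truncated = x_bounds[: idx + 1]  # drop everything beyond the PEC boundary
```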
```
only for sides that have supported boundaries. For unsupported boundary types,
a warning is logged only if the mode solver plane intersects that boundary.
The mode solver supports:
```
The PML in mode solver is handled separately from the FDTD PML. I think previously the PEC was also handled separately in the mode solver vs. the FDTD boundary condition.
It would be nice to unify things, as long as it works in all use cases. Can PML handling be unified in the same way?
I'm not sure about all the new warnings -- I guess they are because those boundary conditions are not supported in the mode solver. But I'm also a bit worried about having the mode solver behave so qualitatively differently depending on the underlying FDTD simulation (PEC vs high-conductivity medium, for example).
> I think previously the PEC was also handled separately in the mode solver vs. the FDTD boundary condition.
Yea, that is what was causing issues. In RF simulations you might use a PEC boundary condition to take the place of a ground plane, but the exact location of the PEC boundary in the FDTD simulation and in the ModeSolver solve did not line up. So the ModeSolver was calculating modes with effectively one extra cell before the PEC.
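For concreteness, a minimal sketch of that kind of setup, assuming the public tidy3d `Simulation`/`ModeSolver` API (sizes, frequency, and grid settings are placeholders, and no geometry is included): a PEC boundary on the minus-y side stands in for a ground plane, and the mode plane spans the full y extent so it touches that boundary.

```python
import tidy3d as td
from tidy3d.plugins.mode import ModeSolver

freq0 = 10e9  # 10 GHz, placeholder

# PEC on the -y side stands in for a ground plane; other sides use PML
sim = td.Simulation(
    size=(10e3, 5e3, 10e3),  # um, placeholder dimensions
    grid_spec=td.GridSpec.uniform(dl=200.0),
    run_time=1e-9,
    boundary_spec=td.BoundarySpec(
        x=td.Boundary.pml(),
        y=td.Boundary(minus=td.PECBoundary(), plus=td.PML()),
        z=td.Boundary.pml(),
    ),
)

# the mode plane spans the full y extent, so its lower edge sits on the simulation's PEC boundary
mode_solver = ModeSolver(
    simulation=sim,
    plane=td.Box(center=(0, 0, 0), size=(4e3, 5e3, 0)),
    mode_spec=td.ModeSpec(num_modes=1),
    freqs=[freq0],
)
```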
I guess the big picture question is what should we do if the ModeSolver plane bounds intersect with the Simulation bounds:
- Override the BCs in the ModeSolver with the Simulation's BCs
- Keep PECs or whatever BCs are set in the ModeSolver
I think the PML handling could also be unified; it seems a little harder since we probably should use all of the PML layers from the Tidy3D simulation, instead of the subset of layers that might exist in an intersection of the mode plane and the simulation bounds.
Yes there are a lot of new warnings, but I would say that the current behavior of ignoring the Simulation BCs is ripe for confusion!
Can you try running some simple EME simulations and make sure there aren’t excessive warnings?
I think it's quite common to have the mode plane smaller than the simulation domain, and it'll be very useful to fix that too? Regarding the closest grid boundary, I guess it should work in most cases, as it's expected to have the mode plane boundary lie at the metal surface, and the metal surface is snapped to the grid.
Greptile Overview
Greptile Summary
This PR fixes a bug where the mode solver grid would extend beyond simulation domain boundaries, causing incorrect PEC boundary condition application. The fix implements grid truncation at simulation boundaries and zero-pads fields outside the domain.
Key changes:
- New `_sim_boundary_positions` method to detect PEC boundaries and determine where to truncate the computational grid
- Truncation logic in `compute_modes` that clips coordinates, permittivity, permeability, and basis fields to simulation bounds

The implementation correctly handles the core issue, but there's a minor inconsistency in the boundary intersection detection logic that could cause edge cases.
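A rough standalone illustration of the truncate-then-zero-pad step (array shapes follow the `(2, 3, Nx, Ny, 1, num_modes)` layout mentioned in the sequence diagram below; variable names are illustrative, not the solver's internals):

```python
import numpy as np

num_modes = 3
nx_full, ny_full = 120, 80
trim_min = (0, 1)              # one cell removed at the -y PEC boundary
trim_max = (nx_full, ny_full)  # nothing removed on the + sides
nx_t = trim_max[0] - trim_min[0]
ny_t = trim_max[1] - trim_min[1]

# fields solved on the truncated grid
fields_trunc = np.random.rand(2, 3, nx_t, ny_t, 1, num_modes)

# zero-pad back to the full mode-plane grid so output shapes match the untruncated case
pad_widths = (
    (0, 0), (0, 0),
    (trim_min[0], nx_full - trim_max[0]),
    (trim_min[1], ny_full - trim_max[1]),
    (0, 0), (0, 0),
)
fields_full = np.pad(fields_trunc, pad_widths, mode="constant", constant_values=0.0)
assert fields_full.shape == (2, 3, nx_full, ny_full, 1, num_modes)
```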
Confidence Score: 4/5
- There is a minor inconsistency in boundary checking (`np.isclose` vs `fp_eps` threshold) that could cause edge case failures. The core truncation and zero-padding logic appears sound.
- See `tidy3d/components/mode/mode_solver.py` for the boundary intersection logic inconsistency.

Important Files Changed
File Analysis
- `tidy3d/components/mode/mode_solver.py`: New `_sim_boundary_positions` method to detect PEC boundaries and determine truncation positions. Passes boundary positions to solver for grid truncation. Minor inconsistency in boundary checking logic.
- `tidy3d/components/mode/solver.py`: New `_truncate_medium_data` helper method.

Sequence Diagram
```mermaid
sequenceDiagram
    participant MS as ModeSolver
    participant MSC as ModeSolver._solve_single_freq
    participant SBP as ModeSolver._sim_boundary_positions
    participant CM as compute_modes (solver.py)
    participant TMD as _truncate_medium_data

    MS->>MS: User calls .data property
    MS->>MSC: _solve_single_freq(freq, coords, symmetry)
    MSC->>SBP: Get PEC boundary positions
    SBP->>SBP: Check boundary types (PEC, PMC, etc.)
    SBP->>SBP: Determine if solver grid intersects boundaries
    SBP-->>MSC: Return (pos_min, pos_max) tuples
    MSC->>CM: compute_modes(eps_cross, coords, ..., sim_pec_pos_min, sim_pec_pos_max)
    CM->>CM: Calculate truncation indices from PEC positions
    CM->>CM: Check if truncation needed (trim_min != [0,0] or trim_max != [Nx,Ny])
    alt Truncation needed
        CM->>CM: Truncate coords arrays
        CM->>TMD: Truncate eps_cross
        TMD-->>CM: Truncated eps
        CM->>TMD: Truncate mu_cross (if present)
        TMD-->>CM: Truncated mu
        CM->>CM: Truncate split_curl_scaling (if present)
        CM->>CM: Truncate solver_basis_fields (if present)
    end
    CM->>CM: Solve eigenvalue problem on truncated grid
    CM->>CM: Reshape fields to (2, 3, Nx_trunc, Ny_trunc, 1, num_modes)
    alt Truncation was applied
        CM->>CM: Zero-pad fields back to original size
        Note over CM: np.pad with trim_min/trim_max offsets
    end
    CM-->>MSC: Return (n_complex, fields, eps_spec)
    MSC-->>MS: Return mode data with correct boundary conditions
```