
DeepONet tutorial for Advection Problem #217

Open · wants to merge 3 commits into base: 0.2
Conversation

dario-coscia (Collaborator)

In this PR I added a new tutorial for the DeepONet class, since one was missing.

While writing the tutorial I noticed some bugs, which I corrected. Here is a list of them:

  • The aggregation (MIONet._aggregator) expected its input already stacked, but stacking is not always possible (e.g. net1 output shape [T, N, d], net2 output shape [T, d]: the aggregation performs einsum('TNd,Td->TNd', ...), but torch.stack cannot combine tensors of different shapes). To avoid this problem, MIONet._aggregator now takes a tuple of tensors as input (which is also simpler for the user to understand); see the first sketch after this list.
  • I replaced .reshape with .unsqueeze in output_ = self._reduction(aggregated).reshape(-1, 1), since the final aim of the operation is to add an extra dimension, not to reshape the tensor; see the second sketch after this list.
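
A minimal sketch of the shape mismatch from the first bullet, using made-up values of T, N, d:

```python
import torch

# Hypothetical shapes: net1 output [T, N, d], net2 output [T, d]
T, N, d = 5, 7, 3
out1 = torch.rand(T, N, d)
out2 = torch.rand(T, d)

# torch.stack requires tensors of identical shape, so pre-stacking the
# outputs fails here:
# torch.stack([out1, out2])  # RuntimeError

# Passing the outputs as a tuple lets the aggregation combine them directly:
aggregated = torch.einsum('TNd,Td->TNd', out1, out2)
print(aggregated.shape)  # torch.Size([5, 7, 3])
```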
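
And a small demo of the .reshape vs .unsqueeze point from the second bullet (the shape is made up for illustration):

```python
import torch

# Hypothetical reduced output of shape [T, N]
reduced = torch.rand(5, 7)

print(reduced.reshape(-1, 1).shape)  # torch.Size([35, 1]) -- flattens the batch structure
print(reduced.unsqueeze(-1).shape)   # torch.Size([5, 7, 1]) -- only adds the extra dimension
```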

Finally, I removed the consistency check on the network outputs before aggregation and added a warning message stating that the consistency is up to the user to verify. This is not because I believe the check should be up to the user, but it is a temporary fix, since the way we were checking the consistency was wrong. Indeed, consider what we were doing:

```python
# check trunk branch nets consistency
shapes = []
for key, value in networks.items():
    check_consistency(value, (str, int))
    check_consistency(key, torch.nn.Module)
    input_ = torch.rand(10, len(value))
    shapes.append(key(input_).shape[-1])

if not all(map(lambda x: x == shapes[0], shapes)):
    raise ValueError('The passed networks have not the same '
                     'output dimension.')
```

This is wrong if the user wants to specify a custom network whose input is not of the form [N, len(value)]. For example, consider the following net:

```python
import torch

class Net(torch.nn.Module):
    def __init__(self):
        super().__init__()
        self.layer = torch.nn.Linear(2, 100)

    def forward(self, x):
        # expects a three-dimensional input, e.g. [B, T, 2]
        x_ = x[:, 0, :]
        return self.layer(x_)
```

Because of how the probe input is built, the line shapes.append(key(input_).shape[-1]) will raise an error, since we are trying to slice a two-dimensional tensor (input_ = torch.rand(10, len(value))) as if it were three-dimensional (x[:, 0, :]).
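
For reference, a snippet reproducing the failure with the Net above (the two-column probe mirrors len(value) == 2):

```python
import torch

net = Net()                 # the custom network defined above
input_ = torch.rand(10, 2)  # the 2D probe tensor built by the old check
net(input_)                 # IndexError: too many indices for tensor of dimension 2
```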

The problem of checking network consistency is more general than MIONet or DeepONet; FNO has the same issue. Maybe we should open a separate PR just for this problem; one possible direction is sketched below.
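
Just as a sketch (not part of this PR, and the helper name is hypothetical): compare the actual outputs lazily at the first forward pass, instead of probing the networks with a fabricated 2D input:

```python
def _check_output_consistency(outputs):
    # Hypothetical lazy check, run on the real network outputs at the first
    # forward pass rather than on a fabricated probe tensor.
    last_dims = [out.shape[-1] for out in outputs]
    if len(set(last_dims)) != 1:
        raise ValueError('The passed networks have not the same '
                         'output dimension.')
```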

@dario-coscia dario-coscia self-assigned this Nov 28, 2023
@dario-coscia dario-coscia added the pr-to-review label Nov 28, 2023
Comment on lines +24 to +36
```python
# This test is wrong! The user could define a custom network and do a
# reshape at the end! A more general way to check input and output is
# needed
# def test_constructor_fails_when_invalid_inner_layer_size():
#     branch_net = FeedForward(input_dimensions=1, output_dimensions=10)
#     trunk_net = FeedForward(input_dimensions=2, output_dimensions=8)
#     with pytest.raises(ValueError):
#         DeepONet(branch_net=branch_net,
#                  trunk_net=trunk_net,
#                  input_indeces_branch_net=['a'],
#                  input_indeces_trunk_net=['b', 'c'],
#                  reduction='+',
#                  aggregator='*')
```
Member

I do not understand why the check is removed. What does "The user could define a custom network and do a reshape at the end" mean?

Comment on lines -115 to +132
```diff
-# check trunk branch nets consistency
-shapes = []
-for key, value in networks.items():
-    check_consistency(value, (str, int))
-    check_consistency(key, torch.nn.Module)
-    input_ = torch.rand(10, len(value))
-    shapes.append(key(input_).shape[-1])
+# TODO: Fix it does not work
+# # check trunk branch nets consistency
+# shapes = []
+# for key, value in networks.items():
+#     check_consistency(value, (str, int))
+#     check_consistency(key, torch.nn.Module)
+#     input_ = torch.rand(10, len(value))
+#     shapes.append(key(input_).shape[-1])

-if not all(map(lambda x: x == shapes[0], shapes)):
-    raise ValueError('The passed networks have not the same '
-                     'output dimension.')
+# if not all(map(lambda x: x == shapes[0], shapes)):
+#     raise ValueError('The passed networks have not the same '
+#                      'output dimension.')
```
Member

Why disable the checks on the shapes?

@ndem0 ndem0 added the pr-to-fix label and removed the pr-to-review label Nov 29, 2023
dario-coscia (Collaborator, Author)

The checks are commented out because they are only correct for 2D inputs. I commented them out because it is still not clear what the data format (2D/3D?) should be for all models.

dario-coscia (Collaborator, Author)

@ndem0 can we merge, or do we wait for the input format to be fixed?

@ndem0 ndem0 changed the base branch from master to 0.2 February 8, 2024 11:15
@dario-coscia dario-coscia added the v0.2 label Feb 9, 2024