
Conversation

@dxoigmn (Member) commented Mar 29, 2023

What does this PR do?

This PR updates GradientModifier to be an in-place operation like torch.nn.utils.clip_grad_norm_. This also adds a test to make sure Adversary is properly modifying gradients.
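For context, `torch.nn.utils.clip_grad_norm_` operates on parameters' `.grad` fields in place (the trailing underscore is PyTorch's in-place convention) rather than returning a transformed tensor. A minimal illustration of that calling convention:

```python
import torch

p = torch.nn.Parameter(torch.tensor([3.0, -4.0]))
p.sum().backward()  # p.grad is now [1., 1.]

# clip_grad_norm_ rescales p.grad in place and returns the pre-clip total norm
total_norm = torch.nn.utils.clip_grad_norm_([p], max_norm=1.0)
print(total_norm)  # ~1.4142 (sqrt(2)); p.grad is now ~[0.7071, 0.7071]
```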

Type of change

Please check all relevant options.

  • Improvement (non-breaking)
  • Bug fix (non-breaking)
  • New feature (non-breaking)
  • Breaking change (fix or feature that would cause existing functionality to not work as expected)
  • This change requires a documentation update

Testing

Please describe the tests that you ran to verify your changes. Consider listing any relevant details of your test configuration.

  • make test
  • CIFAR10_CNN_Adv achieves ~70% accuracy

Before submitting

  • The title is self-explanatory and the description concisely explains the PR
  • My PR does only one thing, instead of bundling different changes together
  • I list all the breaking changes introduced by this pull request
  • I have commented my code
  • I have added tests that prove my fix is effective or that my feature works
  • New and existing unit tests pass locally with my changes
  • I have run pre-commit hooks with pre-commit run -a command without errors

Did you have fun?

Make sure you had fun coding 🙃

@dxoigmn dxoigmn marked this pull request as ready for review March 29, 2023 16:30
@dxoigmn dxoigmn requested a review from mzweilin March 29, 2023 16:30
```python
# before
def __call__(self, grad: torch.Tensor) -> torch.Tensor:
    return grad.sign()

# after (in-place; excerpt truncated in the diff view)
def __call__(self, parameters: torch.Tensor | Iterable[torch.Tensor]) -> None:
    if isinstance(parameters, torch.Tensor):
```
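Under the new convention, a sign modifier mutates each tensor's `.grad` buffer instead of returning a transformed gradient. A hypothetical sketch of what such an in-place modifier could look like (the class name and body are illustrative, not the repository's exact code):

```python
from typing import Iterable, Union

import torch


class Sign:
    """In-place gradient modifier: replaces each .grad with its elementwise sign.

    Follows the calling convention of torch.nn.utils.clip_grad_norm_:
    accepts a single tensor or an iterable of tensors and mutates in place.
    """

    @torch.no_grad()
    def __call__(self, parameters: Union[torch.Tensor, Iterable[torch.Tensor]]) -> None:
        if isinstance(parameters, torch.Tensor):
            parameters = [parameters]
        for p in parameters:
            if p.grad is not None:
                # detach() shares storage with p.grad, so sign_() mutates it in place
                p.grad.detach().sign_()
```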
@mzweilin (Contributor) commented Mar 31, 2023

We wouldn't need a batch-aware GradientModifier if we accept the shared modality-iterable-dispatch mechanism in #115

@dxoigmn (Member, Author) commented Apr 3, 2023

@mzweilin: I think it's better to merge this before the generic dispatch mechanism because it updates the tests.

@mzweilin (Contributor) left a review comment

LGTM

@dxoigmn dxoigmn merged commit 20d2078 into main Apr 6, 2023
@dxoigmn dxoigmn deleted the make_gradient_modifier_like_clip_grad_norm branch April 6, 2023 14:42