
Calculate Trampoline onion packet sizes dynamically. #3333

Open · arik-so wants to merge 4 commits into main from dynamic_trampoline_onion_sizes
Conversation

@arik-so (Contributor) commented Sep 24, 2024

Given that Trampoline onion sizes will be variable, this obviates the need to pass the onion size a priori.

Comment on lines 423 to 430
let packet_length: usize = payloads
.iter()
.map(|p| {
let mut payload_len = LengthCalculatingWriter(0);
p.write(&mut payload_len).expect("Failed to calculate length");
payload_len.0 + 32
})
.sum();
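
(For context: the extra 32 bytes added per payload account for the per-hop HMAC that follows each payload in the onion's hop data.)
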
Contributor

Is it harmful for privacy to not use a fixed length?

Contributor Author

Yes. But we can always pad after the fact, right? Or we can make this an option.

Contributor Author

Given that the recipient has to read the entire onion anyway, that its potential size is bounded by how many outer onion hops came before it, and that there are usually not many Trampoline hops, privacy is somewhat limited even so.

Contributor

Would the code be more confusing if we padded after the fact? I'd kinda prefer to review this commit amidst some more context.

Contributor Author

Hm, I do think padding after the fact would be more complicated. The context is mostly integration tests against ACINQ, where the current version does not appear to pad the onion to a fixed size, which we should actually bring up with Bastien. It didn't occur to me yesterday.
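
For context on what "padding after the fact" could look like, here is a rough sketch; it is not part of this PR, and the target-size constant and the exact point where padding would be applied are hypothetical:

	// Hypothetical sketch, not part of this PR: compute packet_length dynamically
	// as above, then round it up to a fixed target before allocating the packet
	// buffer, so the on-the-wire size no longer reveals the number of Trampoline
	// hops. TARGET_TRAMPOLINE_ONION_LEN is a made-up constant for illustration.
	const TARGET_TRAMPOLINE_ONION_LEN: usize = 650;
	let padded_length = core::cmp::max(packet_length, TARGET_TRAMPOLINE_ONION_LEN);
	let mut packet_data = vec![0u8; padded_length];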


codecov bot commented Sep 24, 2024

Codecov Report

Attention: Patch coverage is 0% with 20 lines in your changes missing coverage. Please review.

Project coverage is 89.60%. Comparing base (66fb520) to head (b4da6fb).
Report is 12 commits behind head on main.

Files with missing lines          Patch %   Lines
lightning/src/ln/onion_utils.rs   0.00%     20 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##             main    #3333      +/-   ##
==========================================
- Coverage   89.66%   89.60%   -0.06%     
==========================================
  Files         126      126              
  Lines      102676   102766      +90     
  Branches   102676   102766      +90     
==========================================
+ Hits        92062    92088      +26     
- Misses       7894     7943      +49     
- Partials     2720     2735      +15     
Flag Coverage Δ
89.60% <0.00%> (-0.03%) ⬇️

Flags with carried forward coverage won't be shown.


@arik-so force-pushed the dynamic_trampoline_onion_sizes branch 2 times, most recently from 7f89095 to b4da6fb on September 25, 2024 at 01:53
@TheBlueMatt (Collaborator) left a comment

Feel free to squash when you next push.

Comment on lines +426 to +427
let mut payload_len = LengthCalculatingWriter(0);
p.write(&mut payload_len).expect("Failed to calculate length");
Collaborator

Suggested change
-  let mut payload_len = LengthCalculatingWriter(0);
-  p.write(&mut payload_len).expect("Failed to calculate length");
+  let mut payload_len = p.serialized_length();

Comment on lines 428 to 431
payload_len.0.checked_add(32).expect("Excessive payload size")
})
.try_fold(0usize, |a, b| a.checked_add(b))
.expect("Excessive onion length");
Collaborator

Why check + expect rather than just adding? I don't think we're at any real risk of a u64 overflowing here.

Contributor Author

Frankly, I've been thinking the same thing. Currently it's in a bit of a limbo state: even with the checked addition, it still calls expect. So either I should surface it as an Error struct, or let it panic immediately. Though you're also right, it's unlikely to happen outside of a fuzzing scenario.

Collaborator

I'm not sure it's possible anyway? We'd have to have a huge custom TLV, and we should reject that long before we get to this point (or just blame the user for trying to stuff 4GB of garbage down an onion).
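
Taken together with the serialized_length() suggestion above, the simplified calculation under discussion would look roughly like the following sketch (assuming the Writeable trait is in scope so serialized_length() is available):

	// Sketch of the simplified length calculation: use serialized_length()
	// instead of a manual LengthCalculatingWriter, and rely on plain addition,
	// since realistic payloads are nowhere near overflowing a usize.
	let packet_length: usize = payloads
		.iter()
		.map(|p| p.serialized_length() + 32) // each payload is followed by its 32-byte HMAC
		.sum();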

  let mut chacha = ChaCha20::new(&prng_seed, &[0; 8]);
- chacha.process(&vec![0u8; length as usize], &mut packet_data);
+ chacha.process(&vec![0u8; packet_length], &mut packet_data);
Collaborator

While you're here, mind swapping this for process_in_place?
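
For reference, the suggested swap would look roughly like the following sketch; it assumes packet_data starts zero-filled, so XORing the ChaCha20 keystream in place produces the same bytes as processing a zero-filled temporary into the buffer, without the extra allocation:

	// Sketch of the process_in_place variant (assumes packet_data is all zeroes).
	let mut chacha = ChaCha20::new(&prng_seed, &[0; 8]);
	chacha.process_in_place(&mut packet_data);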

@arik-so force-pushed the dynamic_trampoline_onion_sizes branch from b4da6fb to f12051f on October 27, 2024 at 07:50