Conversation
merkle/coniks/coniks.go
Outdated
@@ -0,0 +1,89 @@
// Copyright 2016 Google Inc. All Rights Reserved.
merkle/coniks/coniks.go
Outdated
// Default is the standard CONIKS hasher.
var Default = New(crypto.SHA512_256)

// hasher implements the sparse merkel tree hashing algorithm specified in the CONIKS paper.
Same typo as before. Merkle. Please fix where you cut and pasted it from too.
merkle/coniks/coniks.go
Outdated
h := m.New()
h.Write(emptyIdentifier)
binary.Write(h, binary.BigEndian, uint64(treeID))
h.Write(index) // TODO block out the bits that are not part of the index.
What are the implications of this TODO. Does it work properly as is?
The result is that the hashes would be incorrect. Thanks for poking at this.
I've implemented and tested the proper masking function now.
// HashEmpty returns the hash of an empty branch at a given depth.
// A depth of 0 indicates the hash of an empty leaf.
// Empty branches within the tree are plain interior nodes e1 = H(e0, e0) etc.
func (m *hasher) HashEmpty(treeID int64, index []byte, height int) []byte {
The comment says 'depth' but the parameter is 'height'. This will cause confusion. If it really is height can you fix it in the interface definition and the other implementation. We've tried to use 'depth' everywhere because of previous confusion.
After merging this PR, I wouldn't be opposed to changing the tree computation algorithms to all use depth. At the moment though, the algorithms themselves use height and it's a little simpler to hide the height to depth conversion inside this function.
merkle/coniks/coniks.go
Outdated
return h.Sum(nil)
}

// HashChildren returns the inner Merkle tree node hash of the two child nodes l and r.
replace 'inner' with 'internal'.
merkle/coniks/coniks_test.go
Outdated
@@ -0,0 +1,47 @@
// Copyright 2016 Google Inc. All Rights Reserved.
"github.com/google/trillian/testonly"
)

const treeID = int64(0)
I'd suggest something other than zero for the ID. I'd also test that different treeID values result in different hashes.
I'm not exactly sure what to do here. The maphasher, which matches the Python / C++ Merkle tree code, does not use TreeID, so different tree IDs would not result in different hashes.
We could
- Change the maphasher to include TreeID and generate new test vectors.
- Keep the existing ones and add a new test with CONIKS test vectors. But do we really want to make the merkle package dependent on all the sub hashing algorithms?
merkle/map_verifier.go
Outdated
}
for i, element := range proof {
	if got, wanta, wantb := len(element), 0, h.Size(); got != wanta && got != wantb {
		return fmt.Errorf("invalid proof: len(proof[%v]) %d, want %d or %d", i, got, wanta, wantb)
Should use "got:, want:" formatting in Errorf.
"github.com/google/trillian/merkle/rfc6962"
)

const treeID = int64(0)
Again, zero is probably not the best choice.
The subhashers that the test uses don't use treeID
OK then as long as hashers that do use it have their own tests.
var defaultLogStrata = []int{8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 8}
var defaultMapStrata = []int{8, 8, 8, 8, 8, 8, 8, 8, 8, 8, 176}

const treeID = int64(0)
OK. We can leave maphasher as is. The coniks one should test that the
treeID is being correctly handled.
fc2a412 to 5b89ec0
Rebased now that #694 has been merged. PTAL
00dbf80 to e67184b
Do you want to hash the
}

// maskIndex returns index with only the left depth bits set.
// index must be of size m.Size() and 0 <= depth <= m.BitLen().
This could probably be an array. I think including an example in the comment would help here. Maybe also mention that it's only used for the last byte to explain the 0xff in position 0.
I would agree, but different hashers can have different sizes for their indexes, which makes using an array here difficult.
Not sure what you mean. It's only used by maskIndex and it's indexed by byte position from 0-7.
ah, yes. -- referring to leftmask?
Yes, leftmask, which should probably also be leftMask.
merkle/coniks/coniks_test.go
Outdated
{index: h2b("FF00000000000000000000000000000000000000"), depth: 5, want: h2b("F800000000000000000000000000000000000000")},
{index: h2b("FF00000000000000000000000000000000000000"), depth: 6, want: h2b("FC00000000000000000000000000000000000000")},
{index: h2b("FF00000000000000000000000000000000000000"), depth: 7, want: h2b("FE00000000000000000000000000000000000000")},
{index: h2b("FF00000000000000000000000000000000000000"), depth: 8, want: h2b("FF00000000000000000000000000000000000000")},
Can you add a couple of test cases that are less uniform and show the mask being applied across byte boundaries etc.
Not sure if github is showing me all the latest changes but I'd like to see a depth above 8.
// e.g. 1 -> 0000000000000000000000000000000000000001
func PaddedBytes(i *big.Int, size int) []byte {
	b := i.Bytes()
	ret := make([]byte, size)
Does this work if there is no padding needed?
Added a test for this.
merkle/hstar2_test.go
Outdated
i int64
want []byte
}{
{i: 1, want: h2b("0000000000000000000000000000000000000001")},
This would be a good place to add the test for zero padding + possibly others.
@@ -29,6 +29,7 @@ import (
"github.com/google/trillian/crypto/keys"
"github.com/google/trillian/extension"
_ "github.com/google/trillian/merkle/objhasher" // Load hashers
I don't think you need the comments on these imports + in other files. The '_' implies it's being imported for side effects.
go vet sometimes complains if there isn't a comment on underscore imports.
Add treeID, index, and depth to HashLeaf and HashEmpty. Fulfills the security requirements in google#670.
Added more tests. PTAL
Add treeID, index, and depth to HashLeaf and HashEmpty.
This fulfills the security requirements and closes #670.
Adds the CONIKS hasher to the set of map hashers.
The next PR will make it available through the command line.