HIP-1056 Block File Reader #10072

Open · wants to merge 10 commits into `main`

Changes from 7 commits
3 changes: 2 additions & 1 deletion docs/design/block-streams.md
@@ -37,7 +37,8 @@ package com.hedera.mirror.common.domain.transaction;
public record BlockItem(Transaction transaction,
TransactionResult transactionResult,
List<TransactionOutput> transactionOutput, // Note: List may be empty
Optional<StateChanges> stateChanges) implements StreamItem {}
List<StateChanges> stateChanges // Note: List may be empty
Collaborator (author) comment:

Two reasons for the change:

  • It's unclear from the doc whether there should be strictly at most one StateChanges per transaction.
  • There are always non-transactional state changes before the block proof; if the last event in the last round has an event transaction, there will be more than one StateChanges. It's unnecessary for the reader to sort out which StateChanges belong to a transaction and which belong to the event transaction.

) implements StreamItem {}
```
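For context on the rationale above, here is a minimal sketch using hypothetical stand-in types (not the real protobuf classes) of a `BlockItem` whose `stateChanges` list carries both transactional and non-transactional entries:

```java
import java.util.List;

// Hypothetical stand-ins for illustration only; the real StateChanges and
// BlockItem types come from the mirror node / protobuf definitions.
record StateChanges(String origin) {}

record BlockItem(List<StateChanges> stateChanges) {}

public class BlockItemSketch {
    public static void main(String[] args) {
        // A transaction's item may carry more than one StateChanges, e.g. the
        // non-transactional state changes emitted before the block proof.
        var item = new BlockItem(List.of(
                new StateChanges("transaction"),
                new StateChanges("non-transactional")));
        System.out.println(item.stateChanges().size()); // prints 2
    }
}
```

With a `List` the reader can pass every `StateChanges` through unchanged instead of deciding which single one belongs to the transaction.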

#### BlockFile
@@ -18,10 +18,11 @@

import com.hedera.hapi.block.stream.output.protoc.BlockHeader;
import com.hedera.hapi.block.stream.protoc.BlockProof;
import com.hedera.hapi.block.stream.protoc.RecordFileItem;
import com.hedera.mirror.common.domain.DigestAlgorithm;
import com.hedera.mirror.common.domain.StreamFile;
import com.hedera.mirror.common.domain.StreamType;
import com.hederahashgraph.api.proto.java.BlockStreamInfo;
import java.util.ArrayList;
import java.util.Collection;
import java.util.List;
import lombok.AllArgsConstructor;
@@ -43,9 +44,6 @@ public class BlockFile implements StreamFile<BlockItem> {
// Used to generate block hash
private BlockProof blockProof;

// Contained within the last StateChange of the block, contains hashes needed to generate the block hash
Collaborator (author) comment:

We have the following two hash leaves:

  • the previous block hash, from BlockHeader
  • the start-of-block state hash, from BlockProof

The other two are calculated from the input block items merkle tree and the output block items merkle tree, so we don't need BlockStreamInfo.

private BlockStreamInfo blockStreamInfo;

@ToString.Exclude
private byte[] bytes;

@@ -76,6 +74,8 @@ public class BlockFile implements StreamFile<BlockItem> {
@ToString.Exclude
private String previousHash;

private RecordFileItem recordFileItem;

private Long roundEnd;

private Long roundStart;
@@ -103,4 +103,51 @@ public Long getIndex() {
public StreamType getType() {
return StreamType.BLOCK;
}

public static BlockFileBuilder builder() {
return new BlockFileBuilder() {
@Override
public BlockFile build() {
prebuild();
return super.build();
}
};
}

public static class BlockFileBuilder {

public BlockFileBuilder addItem(BlockItem blockItem) {
if (this.items$value == null) {
items$set = true;
items$value = new ArrayList<>();
}

items$value.add(blockItem);
return this;
}

public BlockFileBuilder onNewRound(long roundNumber) {
if (roundStart == null) {
roundStart = roundNumber;
}

roundEnd = roundNumber;
return this;
}

public BlockFileBuilder onNewTransaction(long consensusTimestamp) {
if (consensusStart == null) {
consensusStart = consensusTimestamp;
}

consensusEnd = consensusTimestamp;
return this;
}

void prebuild() {
if (count == null) {
count = items$value != null ? (long) items$value.size() : 0;
}
}
}
}
@@ -22,13 +22,12 @@
import com.hedera.mirror.common.domain.StreamItem;
import com.hederahashgraph.api.proto.java.Transaction;
import java.util.List;
import java.util.Optional;
import lombok.Builder;

@Builder(toBuilder = true)
public record BlockItem(
Transaction transaction,
TransactionResult transactionResult,
List<TransactionOutput> transactionOutput,
Optional<StateChanges> stateChanges)
List<StateChanges> stateChanges)
implements StreamItem {}
@@ -0,0 +1,70 @@
/*
* Copyright (C) 2025 Hedera Hashgraph, LLC
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/

package com.hedera.mirror.common.domain.transaction;

import static org.assertj.core.api.Assertions.assertThat;

import java.util.List;
import org.junit.jupiter.api.Test;

class BlockFileTest {

@Test
void addItem() {
var blockItem = BlockItem.builder().build();
var blockFile = BlockFile.builder().addItem(blockItem).build();
assertThat(blockFile.getItems()).containsExactly(blockItem);
}

@Test
void count() {
var blockFile = BlockFile.builder().build();
assertThat(blockFile.getCount()).isZero();

blockFile = BlockFile.builder().count(10L).build();
assertThat(blockFile.getCount()).isEqualTo(10L);

var blockItem = BlockItem.builder().build();
blockFile = BlockFile.builder().items(List.of(blockItem)).build();
assertThat(blockFile.getCount()).isEqualTo(1L);

blockFile = BlockFile.builder().addItem(blockItem).build();
assertThat(blockFile.getCount()).isEqualTo(1L);

blockFile = BlockFile.builder().addItem(blockItem).count(5L).build();
assertThat(blockFile.getCount()).isEqualTo(5L);
}

@Test
void onNewRound() {
var blockFile = BlockFile.builder().onNewRound(1L).build();
assertThat(blockFile).returns(1L, BlockFile::getRoundStart).returns(1L, BlockFile::getRoundEnd);

blockFile = BlockFile.builder().onNewRound(1L).onNewRound(2L).build();
assertThat(blockFile).returns(1L, BlockFile::getRoundStart).returns(2L, BlockFile::getRoundEnd);
}

@Test
void onNewTransaction() {
var blockFile = BlockFile.builder().onNewTransaction(1).build();
assertThat(blockFile).returns(1L, BlockFile::getConsensusStart).returns(1L, BlockFile::getConsensusEnd);

blockFile =
BlockFile.builder().onNewTransaction(1L).onNewTransaction(2L).build();
assertThat(blockFile).returns(1L, BlockFile::getConsensusStart).returns(2L, BlockFile::getConsensusEnd);
}
}
@@ -108,7 +108,9 @@ private StreamFilename(String path, String filename, String pathSeparator) {

// A compressed and uncompressed file can exist simultaneously, so we need uniqueness to not include .gz
this.filenameWithoutCompressor = isCompressed() ? removeExtension(this.filename) : this.filename;
this.instant = extractInstant(filename, this.fullExtension, this.sidecarId, this.streamType.getSuffix());
this.instant = streamType != StreamType.BLOCK
? extractInstant(filename, this.fullExtension, this.sidecarId, this.streamType.getSuffix())
: null;

var builder = new StringBuilder();
if (!StringUtils.isEmpty(this.path)) {
@@ -156,6 +158,14 @@ public static String getFilename(StreamType streamType, FileType fileType, Insta
return StringUtils.joinWith(".", StringUtils.join(timestamp, suffix), extension);
}

public Instant getInstant() {
if (streamType == StreamType.BLOCK) {
throw new IllegalStateException("BLOCK stream file doesn't have instant in its filename");
}

return instant;
}

@SuppressWarnings("java:S3776")
private static TypeInfo extractTypeInfo(String filename) {
List<String> parts = FILENAME_SPLITTER.splitToList(filename);
@@ -0,0 +1,23 @@
/*
* Copyright (C) 2025 Hedera Hashgraph, LLC
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/

package com.hedera.mirror.importer.reader.block;

import com.hedera.mirror.common.domain.transaction.BlockFile;
import com.hedera.mirror.common.domain.transaction.BlockItem;
import com.hedera.mirror.importer.reader.StreamFileReader;

public interface BlockFileReader extends StreamFileReader<BlockFile, BlockItem> {}
@@ -0,0 +1,140 @@
/*
* Copyright (C) 2025 Hedera Hashgraph, LLC
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* http://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/

package com.hedera.mirror.importer.reader.block;

import static com.hedera.mirror.common.domain.DigestAlgorithm.SHA_384;

import com.hedera.hapi.block.stream.protoc.BlockItem;
import com.hedera.mirror.common.util.DomainUtils;
import com.hedera.mirror.importer.exception.StreamFileReaderException;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.ArrayList;
import java.util.List;
import java.util.Objects;
import lombok.NoArgsConstructor;
import lombok.SneakyThrows;
import lombok.Value;
import lombok.experimental.NonFinal;

/**
 * Calculates a block's root hash per the algorithm defined in HIP-1056. Note both the input merkle tree and the output
 * merkle tree are padded with the SHA2-384 hash of an empty byte array to form perfect binary trees.
*/
@NoArgsConstructor
@Value
class BlockRootHashDigest {

private static final byte[] EMPTY_HASH = createMessageDigest().digest(new byte[0]);

@NonFinal
private boolean finalized;

private List<byte[]> inputHashes = new ArrayList<>();

private List<byte[]> outputHashes = new ArrayList<>();

@NonFinal
private byte[] previousHash;

@NonFinal
private byte[] startOfBlockStateHash;

public void addInputBlockItem(BlockItem blockItem) {
inputHashes.add(createMessageDigest().digest(blockItem.toByteArray()));
}

public void addOutputBlockItem(BlockItem blockItem) {
outputHashes.add(createMessageDigest().digest(blockItem.toByteArray()));
}

public String digest() {
if (finalized) {
throw new IllegalStateException("Block root hash is already calculated");
}

validateHash(previousHash, "previousHash");
validateHash(startOfBlockStateHash, "startOfBlockStateHash");

List<byte[]> leaves = new ArrayList<>();
leaves.add(previousHash);
leaves.add(getRootHash(inputHashes));
leaves.add(getRootHash(outputHashes));
leaves.add(startOfBlockStateHash);

byte[] rootHash = getRootHash(leaves);
finalized = true;

return DomainUtils.bytesToHex(rootHash);
}

public void setPreviousHash(byte[] previousHash) {
validateHash(previousHash, "previousHash");
this.previousHash = previousHash;
}

public void setStartOfBlockStateHash(byte[] startOfBlockStateHash) {
validateHash(startOfBlockStateHash, "startOfBlockStateHash");
this.startOfBlockStateHash = startOfBlockStateHash;
}

@SneakyThrows
private static MessageDigest createMessageDigest() {
try {
return MessageDigest.getInstance(SHA_384.getName());
} catch (NoSuchAlgorithmException ex) {
throw new StreamFileReaderException(ex);
}
}

private static byte[] getRootHash(List<byte[]> leaves) {
if (leaves.isEmpty()) {
return EMPTY_HASH;
}

// Pad leaves with EMPTY_HASH to the next 2^n to form a perfect binary tree
int size = leaves.size();
if ((size & (size - 1)) != 0) {
size = Integer.highestOneBit(size) << 1;
while (leaves.size() < size) {
leaves.add(EMPTY_HASH);
}
}

// Iteratively calculate the parent node hash as h(left | right) to get the root hash in bottom-up fashion
while (size > 1) {
for (int i = 0; i < size; i += 2) {
var digest = createMessageDigest();
byte[] left = leaves.get(i);
byte[] right = leaves.get(i + 1);
digest.update(left);
digest.update(right);
leaves.set(i >> 1, digest.digest());
}

size >>= 1;
}

return leaves.getFirst();
}

private static void validateHash(byte[] hash, String name) {
if (Objects.requireNonNull(hash, "Null " + name).length != SHA_384.getSize()) {
throw new IllegalArgumentException(String.format("%s is not %d bytes", name, SHA_384.getSize()));
}
}
}
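As a standalone illustration of the padding rule `BlockRootHashDigest` implements, here is a minimal sketch (hypothetical class and method names, not part of the PR) that pads the leaf list with SHA-384 of the empty byte array up to the next power of two, then folds adjacent pairs bottom-up as h(left || right):

```java
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.ArrayList;
import java.util.List;

// Standalone sketch of the HIP-1056 perfect-binary-tree padding rule.
// Class and helper names here are hypothetical.
public final class MerkleSketch {

    // Hash the concatenation of the given byte arrays with SHA-384.
    static byte[] sha384(byte[]... parts) {
        try {
            var md = MessageDigest.getInstance("SHA-384");
            for (var part : parts) {
                md.update(part);
            }
            return md.digest();
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException(e);
        }
    }

    static byte[] rootHash(List<byte[]> leaves) {
        byte[] emptyHash = sha384(new byte[0]);
        if (leaves.isEmpty()) {
            return emptyHash;
        }

        var level = new ArrayList<>(leaves);
        // Pad with the empty hash until the leaf count is a power of two
        while (Integer.bitCount(level.size()) != 1) {
            level.add(emptyHash);
        }

        // Fold adjacent pairs bottom-up until a single root remains
        while (level.size() > 1) {
            var next = new ArrayList<byte[]>();
            for (int i = 0; i < level.size(); i += 2) {
                next.add(sha384(level.get(i), level.get(i + 1)));
            }
            level = next;
        }
        return level.get(0);
    }

    public static void main(String[] args) {
        var leaves = new ArrayList<byte[]>();
        leaves.add(sha384("a".getBytes()));
        leaves.add(sha384("b".getBytes()));
        leaves.add(sha384("c".getBytes())); // three leaves pad to four
        System.out.println(rootHash(leaves).length); // SHA-384 digest is 48 bytes
    }
}
```

A single leaf is already a perfect tree of size one, so its root is the leaf itself; an empty list hashes to SHA-384 of the empty byte array, matching the `EMPTY_HASH` behavior in the class above.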