
New reporting features #118

Merged
83 commits merged Dec 4, 2023
a96a4dc
Added cp constraints for undisturbed bits
SimoPez Sep 25, 2023
2da54d4
Test output fixed
SimoPez Sep 26, 2023
389654a
Resolved method inputs issues
SimoPez Oct 10, 2023
5838f30
Merge branch 'develop' into feat/cp-undisturbed-bits-constraints
SimoPez Oct 10, 2023
d10dfe0
Resolved tests error
SimoPez Oct 10, 2023
875ae98
Created impossible differential model
SimoPez Oct 15, 2023
50a0ac1
Tested impossible trails search, improved parser for truncated and im…
SimoPez Oct 16, 2023
5d3e2b4
Solver output parser adapted to truncated
SimoPez Oct 24, 2023
5434ca4
Merge branch 'develop' into feat/cp-undisturbed-bits-constraints
SimoPez Nov 1, 2023
8c09224
Integrated new cipher inverse method
SimoPez Nov 10, 2023
2b988d1
Output parsing improved
SimoPez Nov 10, 2023
3921874
Merge branch 'develop' into feat/cp-undisturbed-bits-constraints
SimoPez Nov 10, 2023
17cd452
Fixed component error
SimoPez Nov 14, 2023
de1f087
Implemented pytest for cp impossible differential
SimoPez Nov 17, 2023
d97d8d3
Tests fixed
SimoPez Nov 17, 2023
fd36f86
Fixed bugs
SimoPez Nov 17, 2023
df84786
inverse_cipher aligned
SimoPez Nov 23, 2023
fd66566
Covered new code
SimoPez Nov 24, 2023
2dbbb43
Minor fixes
SimoPez Nov 24, 2023
e24c5eb
Minor fixes
SimoPez Nov 24, 2023
69925df
FEATURE/Feat: Window heuristic per modular addition
juaninf Nov 25, 2023
937a88f
FIX/Fix: Fix versioning
AnaCaceres Nov 27, 2023
8087a43
FEATURE/Add: implement scarf block cipher
SiMohamedRachidi Nov 27, 2023
90e7b5e
Update changelog manually
AnaCaceres Nov 27, 2023
161c04a
Update version manually
AnaCaceres Nov 27, 2023
6f26ea9
add test for scarf block cipher
SiMohamedRachidi Nov 27, 2023
5d20ee2
Merge branch 'develop' into feat/scarf_block_cipher
SiMohamedRachidi Nov 27, 2023
e564a27
FEATURE/Add: external solver support for MILP truncated/impossible mo…
p-huynh Nov 27, 2023
72457fe
Merge branch 'develop' into feat/extend_external_solvers_support_for_…
p-huynh Nov 27, 2023
269b9c3
Merge branch 'develop' into feat/extend_external_solvers_support_for_…
p-huynh Nov 27, 2023
2888aad
Merge remote-tracking branch 'origin/feat/extend_external_solvers_sup…
p-huynh Nov 27, 2023
19d4eaa
add test vectors
SiMohamedRachidi Nov 28, 2023
ef4f559
Merge pull request #108 from Crypto-TII/feat/cp-undisturbed-bits-cons…
peacker Nov 28, 2023
d0fa891
Merge pull request #113 from Crypto-TII/refactoring/window_size_on_mo…
peacker Nov 28, 2023
a8196fe
Merge pull request #115 from Crypto-TII/feat/extend_external_solvers_…
peacker Nov 28, 2023
17b823a
Merge pull request #114 from Crypto-TII/fix/update-changelog
peacker Nov 28, 2023
9956fe2
move scarf_block_cipher_test.py to the correct folder
SiMohamedRachidi Nov 28, 2023
42b29aa
Merge pull request #116 from Crypto-TII/feat/scarf_block_cipher
peacker Nov 29, 2023
4a01c21
updated avalanche tests, neural network tests and continuous tests ou…
MFormenti Oct 5, 2023
e8834ad
Updated branch to solve conflicts with develop
MFormenti Nov 30, 2023
c12e9b0
Added Report class
MFormenti Nov 6, 2023
55f719c
Added Report class and updated test files to include test_name in the…
MFormenti Nov 7, 2023
8717a76
Updated branch to solve conflicts with develop
MFormenti Nov 30, 2023
1297dc1
Updated Report class:
MFormenti Nov 23, 2023
e33bff6
Updated Makefile to fix pytests
MFormenti Nov 23, 2023
19ca45d
Updated branch to solve conflicts with develop
MFormenti Nov 30, 2023
5d887e3
Updated cipher.py, milp_xor_linear_model and utils_test.py
MFormenti Nov 23, 2023
12ebee7
Fixed cipher_test.py and Dockerfile
MFormenti Nov 23, 2023
5973aad
Fixed Dockerfile
MFormenti Nov 23, 2023
984762e
BREAKING/Add:Create report class
MFormenti Nov 23, 2023
fbea8e6
Fixed cipher_test.py
MFormenti Nov 24, 2023
dcf1266
Fixed Dockerfile
MFormenti Nov 24, 2023
52866c8
Fixed DeprecationWarning in report.py
MFormenti Nov 28, 2023
357fc3f
Fixed report.py for statistical tests
MFormenti Nov 29, 2023
5d38055
Updated report_test.py
MFormenti Nov 29, 2023
6b2241d
Updated report.py
MFormenti Nov 29, 2023
1d47948
Updated branch to solve conflicts with develop
MFormenti Nov 30, 2023
33c0606
Updated branch to solve conflicts with develop
MFormenti Nov 30, 2023
e8efa17
Updated branch to solve conflicts with develop
MFormenti Nov 30, 2023
5a3b950
Updated branch to solve conflicts with develop
MFormenti Nov 30, 2023
f7f3c58
Updated branch to solve conflicts with develop
MFormenti Nov 30, 2023
8848d96
rebased branch for merge with develop
MFormenti Nov 30, 2023
35fc10a
rebased branch for merge with develop
MFormenti Nov 30, 2023
e126a97
rebased branch for merge with develop
MFormenti Nov 30, 2023
7d280f1
rebased branch for merge with develop
MFormenti Nov 30, 2023
b17f9af
rebased branch for merge with develop
MFormenti Nov 30, 2023
9e56606
rebased branch for merge with develop
MFormenti Nov 30, 2023
8c1c229
rebased branch for merge with develop
MFormenti Nov 30, 2023
6c0e1da
Updated Makefile to fix pytests
MFormenti Nov 23, 2023
10307c7
rebased branch for merge with develop
MFormenti Nov 30, 2023
5905093
rebased branch for merge with develop
MFormenti Nov 30, 2023
2d95076
Fixed Dockerfile
MFormenti Nov 23, 2023
6cd2d80
Fixed Dockerfile
MFormenti Nov 24, 2023
fb9e9f0
rebased branch for merge with develop
MFormenti Nov 30, 2023
3313c59
rebased branch for merge with develop
MFormenti Nov 30, 2023
3551e6a
rebased branch for merge with develop
MFormenti Nov 30, 2023
91d67a4
rebased branch for merge with develop
MFormenti Nov 30, 2023
52536ba
rebased branch for merge with develop
MFormenti Dec 1, 2023
2cd2049
rebased branch for merge with develop
MFormenti Dec 1, 2023
951c1eb
rebased branch for merge with develop
MFormenti Dec 1, 2023
4177ea6
rebased branch for merge with develop
MFormenti Dec 1, 2023
21a0efb
fixed commented function in utils_test.py
MFormenti Dec 1, 2023
5477059
Merge pull request #117 from Crypto-TII/feat/test_function_output_uni…
peacker Dec 1, 2023
5 changes: 3 additions & 2 deletions claasp/cipher_modules/algebraic_tests.py
@@ -20,7 +20,7 @@
from claasp.cipher_modules.models.algebraic.algebraic_model import AlgebraicModel


def algebraic_tests(cipher, timeout):
def algebraic_tests(cipher, timeout=60):
from sage.structure.sequence import Sequence
nvars_up_to_round = []

@@ -56,7 +56,8 @@ def algebraic_tests(cipher, timeout):
tests_up_to_round.append(result)

input_parameters = {
"timeout": timeout
"timeout": timeout,
"test_name": "algebraic_tests"
}

test_results = {
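The hunk above gives `algebraic_tests` a default timeout and stamps the result with a `test_name`. A minimal sketch of the reshaped report dictionary — only the `input_parameters` keys are taken from the diff; the helper name and the `test_results` payload are illustrative:

```python
# Hypothetical sketch of the report shape produced by algebraic_tests
# after this change; only the "input_parameters" fields come from the
# diff, the rest is an assumption for illustration.
def build_algebraic_report(tests_up_to_round, timeout=60):
    input_parameters = {
        "timeout": timeout,
        "test_name": "algebraic_tests",  # new: lets report tooling identify the test
    }
    return {
        "input_parameters": input_parameters,
        "test_results": tests_up_to_round,
    }
```

With a shape like this, downstream reporting code can dispatch on `input_parameters["test_name"]` instead of guessing the test type from the dictionary's keys.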
91 changes: 45 additions & 46 deletions claasp/cipher_modules/avalanche_tests.py
@@ -36,16 +36,15 @@ def avalanche_tests(cipher, number_of_samples=5, avalanche_dependence_uniform_bi
criterion = compute_criterion_from_avalanche_probability_vectors(cipher, all_avalanche_probability_vectors,
avalanche_dependence_uniform_bias)
intermediate_output_names = add_intermediate_output_components_id_to_dictionary(cipher.get_all_components())
diffusion_tests = {
"input_parameters": {
"number_of_samples": number_of_samples,
"avalanche_dependence_uniform_bias": avalanche_dependence_uniform_bias,
"avalanche_dependence_criterion_threshold": avalanche_dependence_criterion_threshold,
"avalanche_dependence_uniform_criterion_threshold": avalanche_dependence_uniform_criterion_threshold,
"avalanche_weight_criterion_threshold": avalanche_weight_criterion_threshold,
"avalanche_entropy_criterion_threshold": avalanche_entropy_criterion_threshold}}

test_results = init_dictionary_test_results(cipher, intermediate_output_names)
diffusion_tests = {"input_parameters": {
"test_name": "avalanche_tests",
"number_of_samples": number_of_samples,
"avalanche_dependence_uniform_bias": avalanche_dependence_uniform_bias,
"avalanche_dependence_criterion_threshold": avalanche_dependence_criterion_threshold,
"avalanche_dependence_uniform_criterion_threshold": avalanche_dependence_uniform_criterion_threshold,
"avalanche_weight_criterion_threshold": avalanche_weight_criterion_threshold,
"avalanche_entropy_criterion_threshold": avalanche_entropy_criterion_threshold},
"test_results": init_dictionary_test_results(cipher, intermediate_output_names)}

parameters = {
"avalanche_dependence_vectors": [run_avalanche_dependence, 1,
@@ -60,16 +59,15 @@
for intermediate_output_name in list(intermediate_output_names.keys()):
if parameters[criterion_name][0]:
add_intermediate_output_values_to_dictionary(cipher, criterion_name, intermediate_output_names,
parameters, test_results, index, input_name,
parameters,diffusion_tests, index, input_name,
intermediate_output_name)
all_output_vectors, largest_round_criterion_not_satisfied = \
calculate_regular_difference(criterion_name, criterion, intermediate_output_names, parameters,
test_results, input_name, intermediate_output_name)
calculate_average_difference(all_output_vectors, criterion_name, parameters, test_results,
diffusion_tests, input_name, intermediate_output_name)
calculate_average_difference(all_output_vectors, criterion_name, parameters, diffusion_tests,
input_name, intermediate_output_name)
calculate_worst_input_differences(cipher, criterion_name, largest_round_criterion_not_satisfied,
test_results, input_name, intermediate_output_name)
diffusion_tests["test_results"] = test_results
#calculate_worst_input_differences(cipher, criterion_name, largest_round_criterion_not_satisfied,
# diffusion_tests, input_name, intermediate_output_name)

return diffusion_tests

@@ -109,21 +107,21 @@ def add_intermediate_output_components_id_to_dictionary(components):
def add_intermediate_output_values_to_dictionary(cipher, criterion_name, dict_intermediate_output_names,
dict_parameters, dict_test_results, index,
input_name, intermediate_output_name):
dict_test_results[input_name][intermediate_output_name][criterion_name] = {}
dict_test_results[input_name][intermediate_output_name][criterion_name]["input_bit_size"] = \
dict_test_results["test_results"][input_name][intermediate_output_name][criterion_name] = {}
dict_test_results["input_parameters"][f'{intermediate_output_name}_{criterion_name}_input_bit_size'] = \
cipher.inputs_bit_size[index]
output_bit_size = dict_intermediate_output_names[intermediate_output_name][0]
dict_test_results[input_name][intermediate_output_name][criterion_name]["output_bit_size"] = output_bit_size
dict_test_results[input_name][intermediate_output_name][criterion_name]["max_possible_value_per_bit"] = 1
dict_test_results[input_name][intermediate_output_name][criterion_name]["min_possible_value_per_bit"] = 0
dict_test_results[input_name][intermediate_output_name][criterion_name]["expected_value_per_bit"] = \
dict_test_results["input_parameters"][f'{intermediate_output_name}_{criterion_name}_output_bit_size'] = output_bit_size
#dict_test_results[input_name][intermediate_output_name][criterion_name]["max_possible_value_per_bit"] = 1
#dict_test_results[input_name][intermediate_output_name][criterion_name]["min_possible_value_per_bit"] = 0
dict_test_results["input_parameters"][f'{intermediate_output_name}_{criterion_name}_expected_value_per_bit'] = \
dict_parameters[criterion_name][1]
dict_test_results[input_name][intermediate_output_name][criterion_name]["max_possible_value_per_output_block"] = \
dict_test_results["input_parameters"][f'{intermediate_output_name}_{criterion_name}_max_possible_value_per_output_block'] = \
output_bit_size
dict_test_results[input_name][intermediate_output_name][criterion_name]["min_possible_value_per_output_block"] = 0
dict_test_results[input_name][intermediate_output_name][criterion_name]["expected_value_per_output_block"] = \
dict_test_results["input_parameters"][f'{intermediate_output_name}_{criterion_name}_min_possible_value_per_output_block'] = 0
dict_test_results["input_parameters"][f'{intermediate_output_name}_{criterion_name}_expected_value_per_output_block'] = \
output_bit_size * dict_parameters[criterion_name][1]
dict_test_results[input_name][intermediate_output_name][criterion_name]["differences"] = []
dict_test_results["test_results"][input_name][intermediate_output_name][criterion_name] = []


def calculate_regular_difference(criterion_name, dict_criterion, dict_intermediate_output_names, dict_parameters,
@@ -138,8 +136,8 @@ def calculate_regular_difference(criterion_name, dict_criterion, dict_intermedia
criterion_name],
"round": dict_criterion[input_name][intermediate_output_name][index_input_diff][nb_occurence]["round"]}
tmp_dict["total"] = sum(tmp_dict["vector"])
expected_value_per_output_block = dict_test_results[input_name][intermediate_output_name][criterion_name][
"expected_value_per_output_block"]
expected_value_per_output_block = dict_test_results["input_parameters"][(f'{intermediate_output_name}_'
f'{criterion_name}_expected_value_per_output_block')]
threshold = dict_parameters[criterion_name][2]
if expected_value_per_output_block - threshold <= tmp_dict[
"total"] <= expected_value_per_output_block + threshold:
@@ -152,31 +150,33 @@
all_output_vectors[tmp_dict["round"]] = []
all_output_vectors[tmp_dict["round"]].append(tmp_dict["vector"])
output_vectors.append(tmp_dict)
dict_for_each_input_diff = {"input_difference_type": "regular",
"input_difference_value": hex(1 << index_input_diff),
"output_vectors": output_vectors}
dict_test_results[input_name][intermediate_output_name][criterion_name]["differences"].append(
dict_for_each_input_diff)

output_dict = {
"input_difference_value": hex(1 << index_input_diff),
"vectors": [vector["vector"] for vector in output_vectors],
"total": [vector["total"] for vector in output_vectors],
"satisfied_criterion": [vector["criterion_satisfied"] for vector in output_vectors],
"component_ids": [vector["output_component_id"] for vector in output_vectors]
}

dict_test_results["test_results"][input_name][intermediate_output_name][criterion_name].append(output_dict)

return all_output_vectors, dict_largest_round_criterion_not_satisfied


def calculate_average_difference(all_output_vectors, criterion_name, dict_parameters, dict_test_results, input_name,
intermediate_output_name):
dict_for_average_diff = {"input_difference_type": "average", "input_difference_value": 0}
dict_for_average_diff = {"input_difference_value": 'average'}
output_vectors = []
for current_round in all_output_vectors.keys():
tmp_dict = {}
average_vector = [
sum(vec) /
dict_test_results[input_name][intermediate_output_name][criterion_name][
"input_bit_size"] for vec in zip(
*
all_output_vectors[current_round])]
sum(vec) /
dict_test_results["input_parameters"][(f'{intermediate_output_name}'
f'_{criterion_name}_input_bit_size')] for vec in zip(* all_output_vectors[current_round])]
tmp_dict["vector"] = average_vector
tmp_dict["total"] = sum(tmp_dict["vector"])
expected_value_per_output_block = dict_test_results[input_name][
intermediate_output_name][criterion_name]["expected_value_per_output_block"]
expected_value_per_output_block = dict_test_results["input_parameters"][f'{intermediate_output_name}_{criterion_name}_expected_value_per_output_block']
threshold = dict_parameters[criterion_name][2]
if expected_value_per_output_block - \
threshold <= tmp_dict["total"] <= expected_value_per_output_block + threshold:
@@ -185,10 +185,9 @@ def calculate_average_difference(all_output_vectors, criterion_name, dict_parame
tmp_dict["criterion_satisfied"] = False
tmp_dict["round"] = current_round
tmp_dict["output_component_id"] = "None"
output_vectors.append(tmp_dict)
dict_for_average_diff["output_vectors"] = output_vectors
dict_test_results[input_name][intermediate_output_name][criterion_name]["differences"].append(
dict_for_average_diff)
output_vectors.append(tmp_dict["vector"])
dict_for_average_diff["vectors"] = output_vectors
dict_test_results["test_results"][input_name][intermediate_output_name][criterion_name].append(dict_for_average_diff)


def calculate_worst_input_differences(cipher, criterion_name, largest_round_criterion_not_satisfied,
@@ -198,7 +197,7 @@ def calculate_worst_input_differences(cipher, criterion_name, largest_round_crit
worst_input_diffs = [input_diff for input_diff, specific_round in
largest_round_criterion_not_satisfied.items()
if specific_round == max_round_criterion_not_satisfied]
dict_test_results[input_name][intermediate_output_name][criterion_name]["worst_differences"] = worst_input_diffs
dict_test_results["test_results"][input_name][intermediate_output_name][criterion_name].append(worst_input_diffs)


def avalanche_probability_vectors(cipher, nb_samples):
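The avalanche hunks above move per-criterion metadata into flat `input_parameters` keys of the form `{intermediate_output_name}_{criterion_name}_...` and turn each per-criterion result into a plain list of per-difference dicts with parallel lists. A sketch under those naming conventions — helper names and the sample data are hypothetical, the key formats follow the diff:

```python
# Sketch of the reshaped avalanche report. Key naming mirrors the diff
# ({output}_{criterion}_... in "input_parameters", lists of per-difference
# dicts in "test_results"); the helper names are hypothetical.
def init_report(number_of_samples=5):
    return {"input_parameters": {"test_name": "avalanche_tests",
                                 "number_of_samples": number_of_samples},
            "test_results": {}}

def register_criterion(report, input_name, output_name, criterion,
                       input_bit_size, output_bit_size, expected_per_bit):
    prefix = f"{output_name}_{criterion}"
    params = report["input_parameters"]
    params[f"{prefix}_input_bit_size"] = input_bit_size
    params[f"{prefix}_output_bit_size"] = output_bit_size
    params[f"{prefix}_expected_value_per_output_block"] = output_bit_size * expected_per_bit
    report["test_results"].setdefault(input_name, {}) \
                          .setdefault(output_name, {})[criterion] = []

def append_difference(report, input_name, output_name, criterion,
                      index_input_diff, output_vectors):
    # One entry per input difference: parallel lists instead of nested dicts.
    entry = {
        "input_difference_value": hex(1 << index_input_diff),
        "vectors": [v["vector"] for v in output_vectors],
        "total": [v["total"] for v in output_vectors],
        "satisfied_criterion": [v["criterion_satisfied"] for v in output_vectors],
        "component_ids": [v["output_component_id"] for v in output_vectors],
    }
    report["test_results"][input_name][output_name][criterion].append(entry)
```

The parallel-list layout keeps each difference's record flat, which is what lets the averaged vectors be appended later as just one more entry tagged `'average'`.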
7 changes: 7 additions & 0 deletions claasp/cipher_modules/code_generator.py
@@ -20,6 +20,7 @@
import os
import math
import inspect
import time
from subprocess import call

import claasp
@@ -247,6 +248,7 @@ def generate_bit_based_vectorized_python_code_string(cipher, store_intermediate_

code.extend([f' {cipher.inputs[i]}=input[{i}]' for i in range(len(cipher.inputs))])
for component in cipher.get_all_components():
start = time.time()
params = prepare_input_bit_based_vectorized_python_code_string(component)
component_types_allowed = ['constant', 'linear_layer', 'concatenate', 'mix_column',
'sbox', 'cipher_output', 'intermediate_output', 'fsr']
@@ -258,6 +260,8 @@
name = component.id
if verbosity and component.type != 'constant':
code.append(f' bit_vector_print_as_hex_values("{name}_output", {name})')
end=time.time()
print(f'{component.id} time = {end-start}')
if store_intermediate_outputs:
code.append(' return intermediateOutputs')
elif CIPHER_INVERSE_SUFFIX in cipher.id:
@@ -315,6 +319,7 @@ def generate_byte_based_vectorized_python_code_string(cipher, store_intermediate
code.append(f' {cipher.inputs[i]}=input[{i}]')
bit_sizes[cipher.inputs[i]] = cipher.inputs_bit_size[i]
for component in cipher.get_all_components():
start = time.time()
params = prepare_input_byte_based_vectorized_python_code_string(bit_sizes, component)
bit_sizes[component.id] = component.output_bit_size
component_types_allowed = ['constant', 'linear_layer', 'concatenate', 'mix_column',
@@ -330,6 +335,8 @@
if verbosity and component.type != 'constant':
code.append(f' byte_vector_print_as_hex_values("{name}_input", {params})')
code.append(f' byte_vector_print_as_hex_values("{name}_output", {name})')
end=time.time()
print(f'{component.id} time = {end-start}')
if store_intermediate_outputs:
code.append(' return intermediateOutputs')
elif CIPHER_INVERSE_SUFFIX in cipher.id:
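The code-generator hunks above wrap each component's code-emission step in `time.time()` calls and print the elapsed time per component id. Reduced to a sketch (the component ids and the work callback are hypothetical stand-ins for the emission logic):

```python
import time

# Per-iteration timing pattern used in the generator loops: record a
# timestamp before and after handling each component, then print the delta.
def emit_with_timing(component_ids, emit_one):
    for component_id in component_ids:
        start = time.time()
        emit_one(component_id)  # stands in for the per-component code emission
        end = time.time()
        print(f'{component_id} time = {end - start}')

emit_with_timing(['sbox_0_0', 'xor_0_1'], lambda cid: None)
```

This prints one line per component; the measured values vary per run. For profiling, `time.perf_counter()` would give higher resolution than `time.time()`, but the sketch keeps the diff's choice.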
2 changes: 1 addition & 1 deletion claasp/cipher_modules/component_analysis_tests.py
@@ -1050,4 +1050,4 @@ def remove_components_with_strings_as_values(results_without_xor):
str_in_list.append(isinstance(results_without_xor[i]["properties"][result_property]["value"], str))
if True not in str_in_list:
results.append(results_without_xor[i])
return results
return results
39 changes: 34 additions & 5 deletions claasp/cipher_modules/continuous_tests.py
@@ -78,7 +78,8 @@ def _create_list_fixing_some_inputs(tag_input):
values.append(value_object)

continuous_diffusion_tests[tag_input][tag_output]["continuous_avalanche_factor"] = {}
continuous_diffusion_tests[tag_input][tag_output]["continuous_avalanche_factor"]["values"] = values
value_list = [X['value'] for X in values]
continuous_diffusion_tests[tag_input][tag_output]["continuous_avalanche_factor"]["values"] = value_list

return continuous_diffusion_tests

@@ -291,6 +292,12 @@ def continuous_diffusion_factor(cipher, beta_number_of_samples, gf_number_sample
output_tag, input_bit)
i += 1

for it in continuous_neutrality_measures.keys():
for out in continuous_neutrality_measures[it].keys():
copy_values = [list(X.values()) for X in continuous_neutrality_measures[it][out]['diffusion_factor']['values']]
copy_values = [value for round in copy_values for value in round]
continuous_neutrality_measures[it][out]['diffusion_factor']['values'] = copy_values

return continuous_neutrality_measures
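The new loop at the end of `continuous_diffusion_factor` flattens each `'values'` payload from a list of per-round dicts into one flat list of numbers. The transformation in isolation — the sample data is hypothetical:

```python
# Flattening applied to the 'values' entries: a list of per-round dicts
# becomes a single flat list. Dicts preserve insertion order in
# Python 3.7+, so per-round value order is kept.
def flatten_round_values(values):
    per_round = [list(entry.values()) for entry in values]
    return [value for round_values in per_round for value in round_values]

flatten_round_values([{"bit_0": 0.25, "bit_1": 0.5}, {"bit_0": 0.75}])
```

The same flattening (plus dropping the `input_bit`/`output_bits` bookkeeping keys) is applied to the continuous-neutrality output inside `continuous_diffusion_tests`, so all sub-tests expose uniform flat value lists.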


@@ -318,7 +325,19 @@ def continuous_diffusion_tests(cipher,
is_continuous_avalanche_factor=True,
is_continuous_neutrality_measure=True,
is_diffusion_factor=True):
continuous_diffusion_tests = {}
continuous_diffusion_tests = {"input_parameters": {
'continuous_avalanche_factor_number_of_samples': continuous_avalanche_factor_number_of_samples,
'threshold_for_avalanche_factor': threshold_for_avalanche_factor,
'continuous_neutral_measure_beta_number_of_samples': continuous_neutral_measure_beta_number_of_samples,
'continuous_neutral_measure_gf_number_samples': continuous_neutral_measure_gf_number_samples,
'continuous_diffusion_factor_beta_number_of_samples': continuous_diffusion_factor_beta_number_of_samples,
'continuous_diffusion_factor_gf_number_samples': continuous_diffusion_factor_gf_number_samples,
'is_continuous_avalanche_factor': is_continuous_avalanche_factor,
'is_continuous_neutrality_measure': is_continuous_neutrality_measure,
'is_diffusion_factor': is_diffusion_factor
},
"test_results": {}}

if is_diffusion_factor:
continuous_diffusion_factor_output = continuous_diffusion_factor(
cipher,
@@ -329,17 +348,27 @@
cipher,
continuous_neutral_measure_beta_number_of_samples,
continuous_neutral_measure_gf_number_samples)

if is_continuous_avalanche_factor:
continuous_avalanche_factor_output = continuous_avalanche_factor(cipher,
threshold_for_avalanche_factor,
continuous_avalanche_factor_number_of_samples)

inputs_tags = list(continuous_neutrality_measure_output.keys())
output_tags = list(continuous_neutrality_measure_output[inputs_tags[0]].keys())

for it in inputs_tags:
for out in output_tags:
copy_values = [list(X.values()) for X in continuous_neutrality_measure_output[it][out]['continuous_neutrality_measure']['values']]
copy_values = [value for round in copy_values for value in round]
continuous_neutrality_measure_output[it][out]['continuous_neutrality_measure']['values'] = copy_values
continuous_neutrality_measure_output[it][out]['continuous_neutrality_measure'].pop('input_bit')
continuous_neutrality_measure_output[it][out]['continuous_neutrality_measure'].pop('output_bits')

for input_tag in inputs_tags:
continuous_diffusion_tests[input_tag] = {}
continuous_diffusion_tests["test_results"][input_tag] = {}
for output_tag in output_tags:
continuous_diffusion_tests[input_tag][output_tag] = {
continuous_diffusion_tests["test_results"][input_tag][output_tag] = {
**continuous_neutrality_measure_output[input_tag][output_tag],
**continuous_avalanche_factor_output[input_tag][output_tag],
**continuous_diffusion_factor_output[input_tag][output_tag],
@@ -436,4 +465,4 @@ def continuous_neutrality_measure_for_bit_j_and_beta(cipher, input_bit, beta, nu
**continuous_diffusion_tests, **continuous_avalanche_factor_by_tag_input_dict
}

return continuous_diffusion_tests
return continuous_diffusion_tests
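With this change, `continuous_diffusion_tests` returns the same two-level skeleton as the other tests: every parameter under `"input_parameters"` and the merged per-input/per-output dictionaries under `"test_results"`. A sketch of the skeleton and the per-tag merge step — the helper names are hypothetical, the shape follows the diff:

```python
# Hypothetical sketch: report skeleton plus the per-tag merge performed
# over the sub-test outputs (neutrality, avalanche, diffusion factor).
def init_continuous_report(**parameters):
    return {"input_parameters": dict(parameters), "test_results": {}}

def merge_tag_results(report, input_tag, output_tag, *sub_results):
    merged = {}
    for sub in sub_results:  # each sub-test's output, keyed by the same tags
        merged.update(sub[input_tag][output_tag])
    report["test_results"].setdefault(input_tag, {})[output_tag] = merged
```

Merging with `dict.update` assumes the sub-tests use disjoint result keys (e.g. `continuous_avalanche_factor` vs. `diffusion_factor`), which holds for the dictionaries built in this module.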