Any distribution of the Java 17 JDK.

To run the application directly, execute...

```shell
./gradlew clean run
```

This will run the web API on port 8080. You can view the API documentation at `/openapi`.
- Run `brew install mike-engel/jwt-cli/jwt-cli`.
- Run:

  ```shell
  jwt encode --exp='+5min' --jti $(uuidgen) --alg RS256 --no-iat -S @/PATH_TO_FILE_ON_YOUR_MACHINE/trusted-intermediary/mock_credentials/organization-trusted-intermediary-private-key-local.pem
  ```

- Copy the token from the terminal and paste it into your Postman body with the key `client_assertion`.
- Add a key to the body with the key `scope` and value of `trusted-intermediary`.
- The body type should be `x-www-form-urlencoded`.
- You should be able to run the POST call against the `v1/auth/token` endpoint to receive a bearer token to be used in this step.
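If you prefer the command line over Postman, the same token exchange can be sketched with curl. The host and port are assumptions based on the local defaults above, and `$JWT` stands for the output of the `jwt encode` command:

```shell
# Exchange the generated JWT for a bearer token (localhost:8080 assumed).
# JWT holds the output of the `jwt encode` command shown above.
curl --request POST 'http://localhost:8080/v1/auth/token' \
  --header 'Content-Type: application/x-www-form-urlencoded' \
  --data-urlencode 'scope=trusted-intermediary' \
  --data-urlencode "client_assertion=$JWT"
```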
The additional requirements needed to contribute towards development are...
To set up the necessary environment variables, you can use the `generate_env.sh` script. This script will create a `.env` file in the resource folder with the required configuration. Follow these steps:

- Navigate to the project directory.
- Run the `generate_env.sh` script:

  ```shell
  ./generate_env.sh
  ```
To compile the application, execute...

```shell
./gradlew shadowJar
```

Once compiled, the built artifact is `/app/build/libs/app-all.jar`.
To run the unit tests, execute...

```shell
./gradlew clean allUnitTests
```
End-to-end tests are meant to interact with the API and assert that its overall flow is operating correctly. They require the API to be running already.

To run them, execute...

```shell
./gradlew e2e:clean e2e:test
```

Because the previous command requires the API to be running already, a helper Bash script is available to streamline this flow...

```shell
./e2e-execute.sh
```

This will start the API, wait for it to respond, run the end-to-end tests against that running API, and then stop the API.
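For reference, the start/wait/test/stop pattern that such a helper implements can be sketched as follows. This is an illustrative outline, not the actual contents of `e2e-execute.sh`:

```shell
#!/usr/bin/env bash
# Illustrative start -> wait -> test -> stop sketch (the real script may differ).

./gradlew clean run &   # start the API in the background
API_PID=$!

# Wait up to ~60s for the API to answer before running the tests.
for _ in $(seq 1 60); do
  curl -sf http://localhost:8080/openapi > /dev/null && break
  sleep 1
done

./gradlew e2e:clean e2e:test   # run the end-to-end suite
STATUS=$?

kill "$API_PID"                # stop the API
exit $STATUS
```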
These tests are located under the `e2e` Gradle sub-project directory. Like any Gradle project, there are the `main` and `test` directories. The `test` directory contains the tests. The `main` directory contains our custom framework that helps us interact with the API.
Load tests are completed with Locust.io. Run the load tests by running...

```shell
./load-execute.sh
```

This will run the API for you, so there is no need to run it manually.

The `locustfile.py` that specifies the load test is located at `./operations/locustfile.py`.

If you want to run the load test in an interactive mode, run...

```shell
locust -f ./operations/locustfile.py
```

The terminal will start a local web interface, and you can enter the swarm parameters for the test and the local URL where the app is running (usually http://localhost:8080). You can also set time limits for the tests under 'Advanced Settings'.
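Locust can also be run without the web interface. The flags below are standard Locust options; the user count, spawn rate, duration, and host are illustrative values, not project defaults:

```shell
# Headless run: 10 simulated users, spawning 2/s, for 1 minute,
# against a locally running API (host is an assumption).
locust -f ./operations/locustfile.py --headless \
  -u 10 -r 2 -t 1m --host http://localhost:8080
```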
We have a number of environments that are split between CDC and non-CDC Azure Entra domains and subscriptions.
The Internal environment is meant to be the Wild West: anyone can push to it to test something, and there is no requirement that only good builds be pushed to it. Use the Internal environment if you want to test something in a deployed environment in a non-CDC Azure Entra domain and subscription.
To deploy to the Internal environment...

- Check with the team that no one is already using it.
- Find the `internal` branch and delete it in GitHub.
- Delete your local `internal` branch if needed.

  ```shell
  git branch -D internal
  ```

- From the branch you want to test, create a new `internal` branch.

  ```shell
  git checkout -b internal
  ```

- Push the branch to GitHub.

  ```shell
  git push --set-upstream origin internal
  ```

Then the deploy will run.
Remember that you now have the `internal` branch checked out locally. If you make subsequent code changes, you will make them on the `internal` branch instead of your original branch.
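Assuming the remote `internal` branch has already been deleted in GitHub, the local steps can be sketched as one sequence (remote name `origin` assumed):

```shell
# Recreate the internal branch from the branch under test and push it.
git branch -D internal 2>/dev/null || true   # drop any stale local copy
git checkout -b internal                      # branch off the current branch
git push --set-upstream origin internal       # pushing triggers the deploy
```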
The Dev environment is similar to the Internal environment but deploys to a CDC Azure Entra domain and subscription. It is also meant to be the Wild West. Dev deploys similarly to the Internal environment, but you interact with the `dev` branch.
The Staging environment is production-like and meant to be stable. It deploys to a non-CDC Azure Entra domain and subscription. Deployments occur when a commit is made to the `main` branch. `main` is a protected branch and requires PR reviews before merge.
The Prod environment does not exist yet.
There is minimal set-up needed to get Terraform squared away before you can run the Terraform commands in a new Azure environment in the Flexion Entra domain (for example, the `internal` environment). This does not apply to the CDC Entra domains and subscriptions.
- Create a resource group.
- Create a storage account inside the aforementioned resource group.
- Within the new storage account, create a Container.
- Within Azure Entra...
  - Create an App Registration.
  - Add federated credentials to the App Registration:
    - `repo:CDCgov/trusted-intermediary:ref:refs/heads/main` (for terraform apply).
    - `repo:CDCgov/trusted-intermediary:environment:staging` (for staging webapp deploy).
    - And presumably other repo paths needed in the future for other environments and branches.
- Within your Subscription, assign the Contributor role to the previously created App Registration.
- Add GitHub Action secrets to your GitHub repository.
- A secret with the tenant ID from Azure Entra directory.
- A secret with the ID from the subscription that everything should be deployed into.
- A secret with the ID of the App Registration created previously.
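As a hedged sketch, the Azure-side bootstrap above might look like the following with the Azure CLI and GitHub CLI. Every resource name and location here is a placeholder, not a value from this project; only the secret names match the ones the workflow references later:

```shell
# Placeholder names throughout -- substitute your own.
az group create --name tf-state-rg --location eastus
az storage account create --name tfstatestore123 --resource-group tf-state-rg
az storage container create --name tfstate --account-name tfstatestore123

# App Registration for GitHub OIDC (federated credentials are added
# separately, e.g. in the portal or with `az ad app federated-credential create`).
az ad app create --display-name tf-deployer

# GitHub Action secrets (IDs come from the resources created above).
gh secret set AZURE_TENANT_ID --body "<tenant-id>"
gh secret set AZURE_SUBSCRIPTION_ID --body "<subscription-id>"
gh secret set AZURE_CLIENT_ID --body "<app-registration-client-id>"
```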
- Create a copy of one of the environments under the operations folder.
- Name the copy after the new environment.
- Edit the `main.tf` file with the names of the resources previously created: `resource_group_name`, `storage_account_name`, `container_name`. Also update the `environment` to match the new folder name.
- Create a GitHub Action workflow so that automatic deploys can occur. You can take inspiration from our Internal environment deployment. Make sure you set the `AZURE_CLIENT_ID`, `AZURE_TENANT_ID`, and `AZURE_SUBSCRIPTION_ID` based on the secrets created previously.
We use `pre-commit` to run some hooks on every commit. These hooks do linting to ensure things are in a good spot before a commit is made. Please install `pre-commit` and then install the hooks.

```shell
pre-commit install
```
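For a first-time setup, the typical workflow looks like this. The install step assumes pre-commit is not yet on your machine, and running against all files once is optional but useful to baseline your checkout:

```shell
pip install pre-commit        # skip if pre-commit is already installed
pre-commit install            # register the git hooks for this repo
pre-commit run --all-files    # optionally lint the whole tree once up front
```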
Anyone is encouraged to contribute to the repository by forking and submitting a pull request. (If you are new to GitHub, you might start with a basic tutorial.) By contributing to this project, you grant a world-wide, royalty-free, perpetual, irrevocable, non-exclusive, transferable license to all users under the terms of the Apache Software License v2 or later.
Please read `CONTRIBUTING.md` for additional details.
All comments, messages, pull requests, and other submissions received through CDC including this GitHub page may be subject to applicable federal law, including but not limited to the Federal Records Act, and may be archived. Learn more at http://www.cdc.gov/other/privacy.html.
For database documentation, go here.
- Checkout the `main` branch for `CDCgov/trusted-intermediary`.
- Edit the `app/src/main/java/gov/hhs/cdc/trustedintermediary/etor/EtorDomainRegistration.java` file and replace:

  ```java
  if (ApplicationContext.getEnvironment().equalsIgnoreCase("local")) {
      ApplicationContext.register(RSEndpointClient.class, MockRSEndpointClient.getInstance());
  } else {
      ApplicationContext.register(RSEndpointClient.class, ReportStreamEndpointClient.getInstance());
      ApplicationContext.register(AzureClient.class, AzureClient.getInstance());
  }
  ```

  with:

  ```java
  ApplicationContext.register(RSEndpointClient.class, ReportStreamEndpointClient.getInstance());
  ApplicationContext.register(AzureClient.class, AzureClient.getInstance());
  ```
- Run TI with:

  ```shell
  ./gradlew clean app:run
  ```

For Apple Silicon you will want to enable the Docker option *Use Rosetta for x86/amd64 emulation on Apple Silicon*. After enabling this option, it is recommended that you delete all Docker images and containers and rebuild them with this option enabled.
- Checkout the `master` branch for `CDCgov/prime-reportstream`.
- `cd` into `prime-reportstream/prime-router`.
- Run the `./cleanslate` script. For more information, you can refer to the ReportStream docs.
- Run RS with:

  ```shell
  docker compose up --build -d
  ```

- Run:

  ```shell
  ./gradlew resetDB && ./gradlew reloadTable && ./gradlew reloadSettings
  ```

- Edit `prime-router/settings/staging/0149-etor.yml`:
  - Comment out the following two lines under `receivers.transport`:

    ```yaml
    reportUrl: "https://cdcti-staging-api.azurewebsites.net/v1/etor/orders"
    authTokenUrl: "https://cdcti-staging-api.azurewebsites.net/v1/auth/token"
    ```

  - Uncomment the following lines to call the local instance of the API instead:

    ```yaml
    reportUrl: "http://host.docker.internal:8080/v1/etor/orders"
    authTokenUrl: "http://host.docker.internal:8080/v1/auth/token"
    ```
- Run:

  ```shell
  ./prime multiple-settings set -i ./settings/staging/0149-etor.yml
  ```

- Run:

  ```shell
  ./prime organization addkey --public-key /path/to/trusted-intermediary/mock_credentials/organization-trusted-intermediary-public-key-local.pem --scope "flexion.*.report" --orgName flexion --kid flexion.etor-service-sender --doit
  ```
- Set up the local vault secret:
  - Go to http://localhost:8200/.
  - Use the token in `prime-router/.vault/env/.env.local` to authenticate.
  - Go to `Secrets engines` > `secret/` > `Create secret`.
    - Path for this secret: `FLEXION--ETOR-SERVICE-RECEIVER`
    - JSON data:

      ```json
      {
          "@type": "UserApiKey",
          "apiKey": "TI's private key in RS at trusted-intermediary/mock_credentials/organization-report-stream-private-key.pem",
          "user": "flexion"
      }
      ```
- Run:

  ```shell
  curl --header 'Content-Type: application/hl7-v2' --header 'Client: flexion.simulated-hospital' --header 'Authorization: Bearer <token>' --data-binary '@/path/to/message.hl7' 'http://localhost:7071/api/waters'
  ```

  or

  ```shell
  curl --header 'Content-Type: application/fhir+ndjson' --header 'Client: flexion.etor-service-sender' --header 'Authorization: Bearer <token>' --data-binary '@/path/to/message.fhir' 'http://localhost:7071/api/waters'
  ```

After one or two minutes, check that HL7 files have been dropped into the `prime-reportstream/prime-router/build/sftp` folder.

Note: `<token>` should be replaced by the bearer token received from the `/api/token` endpoint.
We use DORA Metrics to measure our DevOps performance. We currently track Deployment Frequency, Change Fail Rate, and Mean Time to Recovery.

The metrics are produced weekly using a GitHub Action and written into CSV files, which are available for download in the workflow job's artifacts.
- Open Practices
- Rules of Behavior
- Thanks and Acknowledgements
- Disclaimer
- Contribution Notice
- Code of Conduct
This repository constitutes a work of the United States Government and is not subject to domestic copyright protection under 17 USC § 105. This repository is in the public domain within the United States, and copyright and related rights in the work worldwide are waived through the CC0 1.0 Universal public domain dedication. All contributions to this repository will be released under the CC0 dedication. By submitting a pull request you are agreeing to comply with this waiver of copyright interest.
The repository utilizes code licensed under the terms of the Apache Software License and therefore is licensed under ASL v2 or later.
The source code in this repository is free: you can redistribute it and/or modify it under the terms of the Apache Software License version 2, or (at your option) any later version.

The source code in this repository is distributed in the hope that it will be useful, but WITHOUT ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the Apache Software License for more details.
You should have received a copy of the Apache Software License along with this program. If not, see http://www.apache.org/licenses/LICENSE-2.0.html
Source code forked from other open source projects inherits the license of the original project.
This repository contains only non-sensitive, publicly available data and information. All material and community participation is covered by the Disclaimer and Code of Conduct. For more information about CDC's privacy policy, please visit http://www.cdc.gov/other/privacy.html.
This repository is not a source of government records, but is a copy to increase collaboration and collaborative potential. All government records will be published through the CDC website.
Please refer to CDC's Template Repository for more information about contributing to this repository, public domain notices and disclaimers, and code of conduct.