jlasky2 edited this page Feb 7, 2024 · 13 revisions

Introduction

[Image: TRAM home screen]

Threat Report ATT&CK Mapper (TRAM) is an open-source platform that automates the mapping of cyber threat intelligence (CTI) reports to MITRE ATT&CK®, reducing the cost and increasing the effectiveness of integrating ATT&CK across the CTI community. Threat intel analysts, providers, and platforms can use TRAM to integrate ATT&CK more easily and consistently into their products.

The platform works out of the box to identify up to 50 common ATT&CK techniques in text documents; it also supports tailoring the model by annotating additional items and rebuilding the model. This wiki describes the results of the Center for Threat-Informed Defense (CTID) research into automated ATT&CK mapping and provides details and instructions for tailoring the platform to your organization's unique dataset.
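To illustrate what "identifying ATT&CK techniques in text" means in practice, here is a toy sketch that maps sentences to technique IDs. This is not TRAM's actual model (TRAM uses a trained language model, not keyword matching), and the keyword-to-technique table below is invented for demonstration; only the technique IDs themselves are real ATT&CK identifiers.

```python
# Toy illustration of mapping CTI text to ATT&CK technique IDs.
# NOT TRAM's model: TRAM uses a trained LLM, and this keyword table
# is a made-up example for demonstration only.
KEYWORD_TO_TECHNIQUE = {
    "phishing": "T1566",        # Phishing
    "powershell": "T1059.001",  # Command and Scripting Interpreter: PowerShell
    "scheduled task": "T1053",  # Scheduled Task/Job
}

def label_sentence(sentence: str) -> list[str]:
    """Return the ATT&CK technique IDs whose keywords appear in the sentence."""
    text = sentence.lower()
    return [tid for kw, tid in KEYWORD_TO_TECHNIQUE.items() if kw in text]

report = "The actor sent a phishing email, then ran PowerShell to persist."
print(label_sentence(report))  # -> ['T1566', 'T1059.001']
```

A real model replaces the keyword lookup with learned text classification, which is why TRAM can both ship pre-trained and be tailored with additional annotations.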

Background

Mapping adversary tactics, techniques, and procedures (TTPs) found in CTI reports to the MITRE ATT&CK knowledge base is difficult, error-prone, and time-consuming. The TRAM project goal is to automate this process by providing these foundational resources:

  1. A methodology and toolset for creating the annotations required for machine learning.
  2. An open-source, annotated dataset that covers 50 ATT&CK techniques.
  3. A pre-trained Large Language Model (LLM) for labeling ATT&CK techniques found in human-readable CTI reports.
  4. A web application for running reports through the machine learning model and viewing the results.

Warning: TRAM is an experimental research project. It requires infrastructure and DevOps experience to get running in your own environment. Also, as a machine learning project, it makes mistakes, such as labeling techniques incorrectly or missing techniques entirely. We recommend TRAM for research purposes, but not for operational use cases.

Getting started

The primary way to use TRAM is to install the web application. For machine learning experts who want to customize the large language model, we offer a collection of Jupyter notebooks.

  • Web App Installation: The web application lets you upload documents, run the machine learning system, and view the techniques that it discovers.

  • Jupyter Notebooks: These notebooks contain code that you can run to fine-tune each model with additional data. Fine-tuning requires high-end GPUs in your environment; alternatively, you can use the "Open in Colab" button in a notebook to run it on Google's Colab service, which offers both paid and free tiers for accessing GPUs.
