
TransacAI Insights Storage Service

This project is the codebase for the Insights Storage Service (ISS) of the TransacAI project.

TransacAI

The TransacAI project is geared towards generating enriched summaries and insights from transactional data, in real time or in batch, using Generative AI and Large Language Models (LLMs). It goes beyond visual and analytical processing of transactional data by producing context-aware, enriched insights in natural language. The focus is on human-centric analysis that is easier to understand and act upon, eliminating the need for multiple complex data-processing steps to derive insights from raw data.

Insights Storage Service (ISS)

The Insights Storage Service (ISS) is a GraphQL API service that stores and retrieves insights generated by the TransacAI project.

The Insights Generation Service (IGS) generates insights from transactional data and sends them to ISS for storage. ISS stores these insights in a PostgreSQL database and provides an API for querying and retrieving them.

Frontend clients can use the ISS API to fetch insights for display to users.

The ISS API is secured with an API key that is required on every request.
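For illustration, a client might attach the key as a request header when calling the GraphQL endpoint. The sketch below is an assumption about the contract, not the actual one: the header name (`x-api-key`) and the `insights` query fields are hypothetical — consult the generated schema.graphql for the real field names.

```typescript
// Hypothetical client-side request construction for the ISS GraphQL API.
// The header name ("x-api-key") and the `insights` query shape are
// illustrative assumptions; check schema.graphql for the real contract.

const ISS_URL = "http://localhost:4000/";

interface GraphQLRequest {
  method: string;
  headers: Record<string, string>;
  body: string;
}

function buildInsightsRequest(apiKey: string): GraphQLRequest {
  const query = `
    query {
      insights {
        id
        summary
        createdAt
      }
    }
  `;
  return {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "x-api-key": apiKey, // assumed header name
    },
    body: JSON.stringify({ query }),
  };
}

// Usage (assuming the service is running locally):
// fetch(ISS_URL, buildInsightsRequest(process.env.TRANSAC_AI_ISS_API_KEY!));
```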

Technical Overview

ISS is a GraphQL API service built using Apollo Server, Prisma Client, and PostgreSQL.

Tech Stack

ISS is built using the following technologies:

  • Node.js with TypeScript
  • Apollo Server (GraphQL API)
  • GraphQL Nexus (code-first schema and type definitions)
  • Prisma Client with PostgreSQL (hosted on Supabase)
  • Docker and Kubernetes (GKE) for packaging and deployment

Directory Structure

The project is structured as follows:

  • prisma/: Prisma schema and migrations
  • src/: Source code
    • nexus-types/: GraphQL Nexus object type definitions
    • schema.ts: GraphQL schema generation using makeSchema
    • context.ts: Apollo Server context setup (with code for API key validation)
    • server.ts: Apollo Server setup
  • schema.graphql: Generated GraphQL schema
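The API-key check in context.ts presumably compares a request header against the configured key before building the per-request context. The sketch below is a simplified assumption of that flow; the header name and the context shape are hypothetical, not taken from the actual source.

```typescript
// Simplified sketch of API-key validation as it might appear in
// context.ts. The header name ("x-api-key") and the context shape
// are assumptions for illustration only.

interface IncomingHeaders {
  [name: string]: string | undefined;
}

// Returns true only when the presented key matches the configured one.
function isValidApiKey(headers: IncomingHeaders, expectedKey: string): boolean {
  const presented = headers["x-api-key"]; // assumed header name
  return presented !== undefined && presented === expectedKey;
}

// In an Apollo Server context function, a failed check would typically
// throw, rejecting the request before any resolver runs.
function buildContext(headers: IncomingHeaders, expectedKey: string) {
  if (!isValidApiKey(headers, expectedKey)) {
    throw new Error("Unauthorized: missing or invalid API key");
  }
  return { authorized: true };
}
```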

Docker

The project includes a Dockerfile and docker-compose.yml for running the ISS service in a Docker container.

Running the Docker Container

To run the ISS service in a Docker container, follow these steps:

  1. Build the Docker image:
docker-compose build
  2. Start the Docker container:
docker-compose up

The ISS service will be running on http://localhost:4000.

Stopping the Docker Container

To stop the Docker container, run:

docker-compose down

Environment Variables

The project uses environment variables for configuration. The following environment variables are required:

  • DATABASE_URL: PostgreSQL database URL (from Supabase) for storing insights
  • DIRECT_URL: Direct (non-pooling) Supabase connection URL, used for database migrations
  • TRANSAC_AI_ISS_API_KEY: API key for securing the ISS API
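A service like this would typically fail fast at startup when any required variable is missing. The helper below is a minimal sketch of that kind of check; `loadRequiredEnv` is a hypothetical name, not part of the ISS codebase.

```typescript
// Minimal sketch of required-environment-variable validation at startup.
// loadRequiredEnv is a hypothetical helper, not part of the ISS codebase.

const REQUIRED_VARS = [
  "DATABASE_URL",
  "DIRECT_URL",
  "TRANSAC_AI_ISS_API_KEY",
] as const;

function loadRequiredEnv(
  env: Record<string, string | undefined>
): Record<string, string> {
  // Collect every required variable that is absent or empty.
  const missing = REQUIRED_VARS.filter((name) => !env[name]);
  if (missing.length > 0) {
    throw new Error(
      `Missing required environment variables: ${missing.join(", ")}`
    );
  }
  // All present; return a clean, fully-typed config object.
  return Object.fromEntries(
    REQUIRED_VARS.map((name) => [name, env[name] as string])
  );
}

// Usage: const config = loadRequiredEnv(process.env);
```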

Google Kubernetes Engine

The project includes Kubernetes manifests for deploying the ISS service to Google Kubernetes Engine (GKE).

The main manifest files are:

  • kubernetes/deployment.yaml: Deployment configuration.
  • kubernetes/service.yaml: Service configuration for load balancer.

Deployment targets the transac-ai-gke cluster, with a load balancer service exposing the deployment to public access on port 80.

The current configuration runs 2 replicas of the deployment, with minimal resources allocated for testing purposes.

More information and deployment instructions can be found in the kubernetes/README.md file.

License

This project is licensed under the MIT License. See the LICENSE file for details.

Issues

If you encounter any issues or bugs while using this project, please report them by following these steps:

  1. Check if the issue has already been reported by searching our issue tracker.
  2. If the issue hasn't been reported, create a new issue and provide a detailed description of the problem.
  3. Include steps to reproduce the issue and any relevant error messages or screenshots.
