This project is the codebase for the Insights Storage Service (ISS) of the TransacAI project.
The TransacAI project is geared towards generating enriched summaries and insights from transactional data, in real time or in batches, using Generative AI and Large Language Models (LLMs). It goes beyond visual and analytical processing of transactional data by generating context-aware, enriched insights in natural language using LLMs. It focuses on delivering human-centric analysis that is easier to understand and act upon, eliminating the need for multiple complex data processing steps to derive insights from raw data.
The Insights Storage Service (ISS) is a GraphQL API service that stores and retrieves insights generated by the TransacAI project.
The Insights Generation Service (IGS) generates insights from transactional data and sends them to ISS for storage. ISS stores these insights in a PostgreSQL database and provides an API for querying and retrieving them.
Frontend clients can use the ISS API to fetch insights for display to users.
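As an illustration, a frontend query for insights might look like the following. The `insights` field and its arguments here are hypothetical; the actual fields are defined by the generated schema in `schema.graphql`:

```graphql
# Hypothetical query shape; real field and argument names
# are defined by the generated schema in schema.graphql.
query {
  insights(clientId: "client-1") {
    id
    insight
    createdAt
  }
}
```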
The ISS API is secured with an API key that is required on all requests.
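A minimal sketch of how such an API key check could work on the server side. The header name `x-api-key` and the `isAuthorized` helper are assumptions for illustration; only the `TRANSAC_AI_ISS_API_KEY` environment variable is named by the project itself:

```typescript
// Hedged sketch of an API key check; the header name "x-api-key"
// and this helper are assumptions, not the project's actual code.
function isAuthorized(
  headers: Record<string, string | undefined>
): boolean {
  const provided = headers["x-api-key"];
  const expected = process.env.TRANSAC_AI_ISS_API_KEY;
  // Reject when the expected key is unset or the provided key mismatches.
  return Boolean(expected) && provided === expected;
}
```

In ISS, a check along these lines would naturally live in the Apollo Server context setup, so every resolver runs only for authenticated requests.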
ISS is a GraphQL API service built using Apollo Server, Prisma Client, and PostgreSQL.
ISS is built using the following technologies:
- Apollo Server: HTTP server for GraphQL APIs
- GraphQL Nexus: GraphQL schema definition and resolver implementation
- Prisma Client: Database access (ORM)
- Prisma Migrate: Database migrations
- PostgreSQL: Database for storing insights (hosted on Supabase)
The project is structured as follows:

- `prisma/`: Prisma schema and migrations
- `src/`: Source code
  - `nexus-types/`: GraphQL Nexus object type definitions
  - `schema.ts`: GraphQL schema generation using `makeSchema`
  - `context.ts`: Apollo Server context setup (including API key validation)
  - `server.ts`: Apollo Server setup
- `schema.graphql`: Generated GraphQL schema
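For illustration, the generated `schema.graphql` might contain type definitions along these lines. The type and field names here are hypothetical; the real schema is produced by `makeSchema` from the Nexus type definitions:

```graphql
# Hypothetical shape of the generated schema; actual types and
# fields come from the Nexus definitions in src/nexus-types/.
type Insight {
  id: ID!
  insight: String!
  createdAt: String!
}

type Query {
  insights(clientId: String!): [Insight!]!
}
```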
The project includes a `Dockerfile` and a `docker-compose.yml` for running the ISS service in a Docker container.
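A minimal sketch of what the `docker-compose.yml` might contain. The service name, port mapping, and env-file wiring are assumptions; the actual file ships with the repository:

```yaml
# Hypothetical compose file; see the repository's docker-compose.yml.
services:
  iss:
    build: .
    ports:
      - "4000:4000"   # Apollo Server's default port
    env_file:
      - .env          # supplies DATABASE_URL, DIRECT_URL, TRANSAC_AI_ISS_API_KEY
```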
To run the ISS service in a Docker container, follow these steps:
- Build the Docker image: `docker-compose build`
- Start the Docker container: `docker-compose up`
The ISS service will then be running on http://localhost:4000.
To stop the Docker container, run `docker-compose down`.
The project uses environment variables for configuration. The following environment variables are required:
- `DATABASE_URL`: PostgreSQL database URL (from Supabase) for storing insights
- `DIRECT_URL`: Direct, non-pooling database connection URL from Supabase (used for migrations)
- `TRANSAC_AI_ISS_API_KEY`: API key for securing the ISS API
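The two database URLs map onto Prisma's `url` and `directUrl` datasource fields. A sketch of the relevant part of `prisma/schema.prisma` (the actual file is in the repository):

```prisma
// Sketch of the datasource block; the real schema lives in prisma/.
datasource db {
  provider  = "postgresql"
  url       = env("DATABASE_URL")   // pooled connection for queries
  directUrl = env("DIRECT_URL")     // direct connection used by Prisma Migrate
}
```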
The project includes Kubernetes manifests for deploying the ISS service to Google Kubernetes Engine (GKE).
The main manifest files are:

- `kubernetes/deployment.yaml`: Deployment configuration
- `kubernetes/service.yaml`: Service configuration for the load balancer

Deployment is done on the `transac-ai-gke` cluster, with a load balancer service exposing the deployment to public access on port 80.
The current policy uses two replicas for the deployment, with bare-minimum resources allocated for testing purposes.
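Based on that policy, `kubernetes/deployment.yaml` likely resembles the following sketch. The image name, labels, and resource numbers are assumptions; only the replica count comes from the description above:

```yaml
# Hypothetical deployment sketch; see kubernetes/deployment.yaml for the real file.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: transac-ai-iss
spec:
  replicas: 2                          # current policy: two replicas
  selector:
    matchLabels:
      app: transac-ai-iss
  template:
    metadata:
      labels:
        app: transac-ai-iss
    spec:
      containers:
        - name: iss
          image: transac-ai-iss:latest # placeholder image name
          ports:
            - containerPort: 4000
          resources:                   # bare-minimum requests for testing
            requests:
              cpu: "100m"
              memory: "128Mi"
```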
More information and deployment instructions can be found in the `kubernetes/README.md` file.
This project is licensed under the MIT License. See the LICENSE file for details.
If you encounter any issues or bugs while using this project, please report them by following these steps:
- Check if the issue has already been reported by searching our issue tracker.
- If the issue hasn't been reported, create a new issue and provide a detailed description of the problem.
- Include steps to reproduce the issue and any relevant error messages or screenshots.