My sql destination #114

Merged (2 commits) on Mar 10, 2024
7 changes: 7 additions & 0 deletions .env.example
@@ -23,6 +23,13 @@ export POSTGRES_DB=
export POSTGRES_USER=
export POSTGRES_PASSWORD=

# MySQL
export MYSQL_HOST=
export MYSQL_PORT=
export MYSQL_DB=
export MYSQL_USER=
export MYSQL_PASSWORD=

# Webhook
export WEBHOOK_URL=
export WEBHOOK_SECRET=
6 changes: 6 additions & 0 deletions .github/workflows/integration.yaml
@@ -54,5 +54,11 @@ jobs:
- name: Produce data to Postgres with multiple tables
run: docker exec datagen datagen -s /tests/schema2.sql -f postgres -n 3 -rs 1000

- name: Produce data to MySQL with Faker.js
run: docker exec datagen datagen -s /tests/mysql-products.sql -f mysql -n 3

- name: Produce data to MySQL with multiple tables
run: docker exec datagen datagen -s /tests/mysql-schema.sql -f mysql -n 3 -rs 1000

- name: Docker Compose Down
run: docker compose down -v
26 changes: 25 additions & 1 deletion README.md
@@ -85,7 +85,7 @@ Fake Data Generator
Options:
-V, --version output the version number
-s, --schema <char> Schema file to use
-f, --format <char> The format of the produced data (choices: "json", "avro", "postgres", "webhook", default: "json")
-f, --format <char> The format of the produced data (choices: "json", "avro", "postgres", "webhook", "mysql", default: "json")
-n, --number <char> Number of records to generate. For infinite records, use -1 (default: "10")
-c, --clean Clean (delete) Kafka topics and schema subjects previously created
-dr, --dry-run Dry run (no data will be produced to Kafka)
@@ -279,6 +279,30 @@ datagen \

> :warning: You can only produce to Postgres with a SQL schema.

#### Producing to MySQL

You can also produce the data to a MySQL database. To do this, you need to specify the `-f mysql` option and provide MySQL connection information in the `.env` file. Here is an example `.env` file:

```
# MySQL
export MYSQL_HOST=
export MYSQL_PORT=
export MYSQL_DB=
export MYSQL_USER=
export MYSQL_PASSWORD=
```

Then, you can run the following command to produce the data to MySQL:

```bash
datagen \
-s tests/products.sql \
-f mysql \
-n 1000
```

> :warning: You can only produce to MySQL with a SQL schema.
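
For illustration, here is a hedged sketch (not part of this PR) of how the `MYSQL_*` variables above might be collected into a single connection config object. The helper name `buildMysqlConfig` and the fallback defaults are assumptions, not datagen's actual internals:

```typescript
// Hypothetical helper: gathers the MYSQL_* variables documented above
// into one connection config object.
export interface MysqlConfig {
  host: string;
  port: number;
  database: string;
  user: string;
  password: string;
}

export function buildMysqlConfig(
  env: Record<string, string | undefined> = process.env
): MysqlConfig {
  return {
    host: env.MYSQL_HOST ?? 'localhost',
    port: Number(env.MYSQL_PORT ?? 3306), // assumed default MySQL port
    database: env.MYSQL_DB ?? '',
    user: env.MYSQL_USER ?? '',
    password: env.MYSQL_PASSWORD ?? ''
  };
}
```

With `dotenv` having loaded the `.env` file, an object shaped like this could be passed to a `mysql2` pool constructor.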

#### Producing to Webhook

You can also produce the data to a Webhook. To do this, you need to specify the `-f webhook` option and provide Webhook connection information in the `.env` file. Here is an example `.env` file:
5 changes: 3 additions & 2 deletions datagen.ts
@@ -23,7 +23,7 @@ program
.requiredOption('-s, --schema <char>', 'Schema file to use')
.addOption(
new Option('-f, --format <char>', 'The format of the produced data')
.choices(['json', 'avro', 'postgres', 'webhook'])
.choices(['json', 'avro', 'postgres', 'webhook', 'mysql'])
.default('json')
)
.addOption(
@@ -58,6 +58,7 @@ global.wait = options.wait;
global.clean = options.clean;
global.dryRun = options.dryRun;
global.prefix = options.prefix;
global.format = options.format;

if (global.debug) {
console.log(options);
@@ -104,7 +105,7 @@ if (!global.wait) {
process.exit(1);
}

if (global.clean && options.format !== 'postgres' && options.format !== 'webhook') {
if (global.clean && options.format !== 'postgres' && options.format !== 'webhook' && options.format !== 'mysql') {
// Only valid for Kafka
const topics = []
for (const table of parsedSchema) {
16 changes: 16 additions & 0 deletions docker-compose.yaml
@@ -34,6 +34,16 @@ services:
ports:
- 5432:5432

mysql:
image: mysql:8.0
environment:
MYSQL_ROOT_PASSWORD: mysql
MYSQL_DATABASE: mysql
MYSQL_USER: mysql
MYSQL_PASSWORD: mysql
ports:
- 3306:3306

datagen:
build: .
container_name: datagen
@@ -47,6 +57,12 @@ services:
POSTGRES_DB: postgres
POSTGRES_USER: postgres
POSTGRES_PASSWORD: postgres
MYSQL_HOST: mysql
MYSQL_PORT: 3306
MYSQL_DB: mysql
MYSQL_USER: root
MYSQL_PASSWORD: mysql

volumes:
- ./tests:/tests
# Override the entrypoint to run the container and keep it running
101 changes: 101 additions & 0 deletions package-lock.json
(generated file; diff not rendered)
1 change: 1 addition & 0 deletions package.json
@@ -52,6 +52,7 @@
"crypto-random-string": "^5.0.0",
"dotenv": "^16.0.2",
"kafkajs": "^2.2.3",
"mysql2": "^3.9.2",
"node-sql-parser": "^4.6.1",
"pg": "^8.11.0"
},
15 changes: 14 additions & 1 deletion src/dataGenerator.ts
@@ -1,6 +1,7 @@
import alert from 'cli-alerts';
import postgresDataGenerator from './postgresDataGenerator.js';
import mysqlDataGenerator from './mysqlDataGenerator.js';
import kafkaDataGenerator from './kafkaDataGenerator.js';
import postgresDataGenerator from './postgresDataGenerator.js';
import webhookDataGenerator from './webhookDataGenerator.js';

interface GeneratorOptions {
@@ -30,6 +31,18 @@ export default async function dataGenerator({

await postgresDataGenerator({ schema, iterations, initialSchema });
break;
case 'mysql':
if (!initialSchema.endsWith('.sql')) {
alert({
type: `error`,
name: `Producing SQL data is only supported with SQL schema files!`,
msg: ``
});
process.exit(1);
}

await mysqlDataGenerator({ schema, iterations, initialSchema });
break;
case 'webhook':
await webhookDataGenerator({ schema, iterations, initialSchema });
break;
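
The diff above dispatches to `mysqlDataGenerator`, but that module's contents are not shown in this PR. As a rough sketch of one piece such a generator would need, the pure helper below (a hypothetical `buildInsert`, not from the PR) turns a single generated record into a parameterized `INSERT` using `mysql2`-style `?` placeholders:

```typescript
// Hypothetical helper: builds a parameterized INSERT for one record.
// Using ? placeholders (the mysql2 style) keeps faked string values
// from being interpolated directly into SQL text.
export function buildInsert(
  table: string,
  record: Record<string, unknown>
): { sql: string; values: unknown[] } {
  const columns = Object.keys(record);
  const placeholders = columns.map(() => '?').join(', ');
  const columnList = columns.map(c => `\`${c}\``).join(', ');
  const sql = `INSERT INTO \`${table}\` (${columnList}) VALUES (${placeholders})`;
  return { sql, values: columns.map(c => record[c]) };
}
```

The generator itself would presumably open a `mysql2/promise` pool from the `MYSQL_*` env vars, loop over the requested iterations, fake one record per table in the parsed schema, and run `pool.execute(sql, values)` for each.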