diff --git a/changelogs/drizzle-kit/0.25.0.md b/changelogs/drizzle-kit/0.25.0.md
new file mode 100644
index 000000000..fc4b36c83
--- /dev/null
+++ b/changelogs/drizzle-kit/0.25.0.md
@@ -0,0 +1,144 @@
## Breaking changes and migration guide for Turso users

If you are using Turso and LibSQL, you will need to upgrade your `drizzle.config` and `@libsql/client` package.

1. This version of drizzle-orm will only work with `@libsql/client@0.10.0` or higher if you are using the `migrate` function. For other use cases, you can continue using previous versions (but we suggest upgrading).
To install the latest version, use the command:

```bash
npm i @libsql/client@latest
```

2. Previously, we had a common `drizzle.config` for SQLite and Turso users, which allowed a shared strategy for both dialects. Starting with this release, we are introducing the `turso` dialect in drizzle-kit. We will evolve and improve Turso as a separate dialect with its own migration strategies.

**Before**

```ts
import { defineConfig } from "drizzle-kit";

export default defineConfig({
  dialect: "sqlite",
  schema: "./schema.ts",
  out: "./drizzle",
  dbCredentials: {
    url: "database.db",
  },
  breakpoints: true,
  verbose: true,
  strict: true,
});
```

**After**

```ts
import { defineConfig } from "drizzle-kit";

export default defineConfig({
  dialect: "turso",
  schema: "./schema.ts",
  out: "./drizzle",
  dbCredentials: {
    url: "database.db",
  },
  breakpoints: true,
  verbose: true,
  strict: true,
});
```

If you are using only SQLite, you can use `dialect: "sqlite"`.

## LibSQL/Turso and SQLite migration updates

### SQLite "generate" and "push" statements updates

Starting from this release, we will no longer generate comments like this:

```sql
'/*\n SQLite does not support "Changing existing column type" out of the box, we do not generate automatic migration for that, so it has to be done manually'
+ '\n Please refer to: 
https://www.techonthenet.com/sqlite/tables/alter_table.php'
+ '\n https://www.sqlite.org/lang_altertable.html'
+ '\n https://stackoverflow.com/questions/2083543/modify-a-columns-type-in-sqlite3'
+ "\n\n Due to that we don't generate migration automatically and it has to be done manually"
+ '\n*/'
```

We will generate a set of statements, and you can decide if it's appropriate to create data-moving statements instead. Here is an example of the SQL file you'll receive now:

```sql
PRAGMA foreign_keys=OFF;
--> statement-breakpoint
CREATE TABLE `__new_worker` (
  `id` integer PRIMARY KEY NOT NULL,
  `name` text NOT NULL,
  `salary` text NOT NULL,
  `job_id` integer,
  FOREIGN KEY (`job_id`) REFERENCES `job`(`id`) ON UPDATE no action ON DELETE no action
);
--> statement-breakpoint
INSERT INTO `__new_worker`("id", "name", "salary", "job_id") SELECT "id", "name", "salary", "job_id" FROM `worker`;
--> statement-breakpoint
DROP TABLE `worker`;
--> statement-breakpoint
ALTER TABLE `__new_worker` RENAME TO `worker`;
--> statement-breakpoint
PRAGMA foreign_keys=ON;
```

### LibSQL/Turso "generate" and "push" statements updates

Since LibSQL supports more ALTER statements than SQLite, we can generate more statements without recreating your schema and moving all the data, which can be potentially dangerous for production environments.

LibSQL and Turso will now have a separate dialect in the Drizzle config file, meaning that we will evolve Turso and LibSQL independently from SQLite and will aim to support as many features as Turso/LibSQL offer.

With the updated LibSQL migration strategy, you will have the ability to:

- **Change Data Type**: Set a new data type for existing columns.
- **Set and Drop Default Values**: Add or remove default values for existing columns.
- **Set and Drop NOT NULL**: Add or remove the NOT NULL constraint on existing columns.
- **Add References to Existing Columns**: Add foreign key references to existing columns.

You can find more information in the [LibSQL documentation](https://github.com/tursodatabase/libsql/blob/main/libsql-sqlite3/doc/libsql_extensions.md#altering-columns).

### LIMITATIONS

- Dropping or altering an index will cause table recreation.

This is because LibSQL/Turso does not support dropping this type of index.

```sql
CREATE TABLE `users` (
  `id` integer NOT NULL,
  `name` integer,
  `age` integer PRIMARY KEY NOT NULL,
  FOREIGN KEY (`name`) REFERENCES `users1`("id") ON UPDATE no action ON DELETE no action
);
```

- If the table has indexes, altering columns will cause table recreation.
- Drizzle-Kit will drop the indexes, modify the columns, and then recreate the indexes.
- Adding or dropping composite foreign keys is not supported and will cause table recreation.

### NOTES

- You can create a reference on any column type, but if you want to insert values, the referenced column must have a unique index or primary key.

```sql
CREATE TABLE parent(a PRIMARY KEY, b UNIQUE, c, d, e, f);
CREATE UNIQUE INDEX i1 ON parent(c, d);
CREATE INDEX i2 ON parent(e);
CREATE UNIQUE INDEX i3 ON parent(f COLLATE nocase);

CREATE TABLE child1(f, g REFERENCES parent(a)); -- Ok
CREATE TABLE child2(h, i REFERENCES parent(b)); -- Ok
CREATE TABLE child3(j, k, FOREIGN KEY(j, k) REFERENCES parent(c, d)); -- Ok
CREATE TABLE child4(l, m REFERENCES parent(e)); -- Error!
CREATE TABLE child5(n, o REFERENCES parent(f)); -- Error!
CREATE TABLE child6(p, q, FOREIGN KEY(p, q) REFERENCES parent(b, c)); -- Error!
CREATE TABLE child7(r REFERENCES parent(c)); -- Error!
```

> **NOTE**: The foreign key for the table child5 is an error because, although the parent key column has a unique index, the index uses a different collating sequence.
See more: https://www.sqlite.org/foreignkeys.html
\ No newline at end of file
diff --git a/changelogs/drizzle-orm/0.34.0.md b/changelogs/drizzle-orm/0.34.0.md
new file mode 100644
index 000000000..490422628
--- /dev/null
+++ b/changelogs/drizzle-orm/0.34.0.md
@@ -0,0 +1,255 @@
## Breaking changes and migration guide for Turso users

If you are using Turso and LibSQL, you will need to upgrade your `drizzle.config` and `@libsql/client` package.

1. This version of drizzle-orm will only work with `@libsql/client@0.10.0` or higher if you are using the `migrate` function. For other use cases, you can continue using previous versions (but we suggest upgrading).
To install the latest version, use the command:

```bash
npm i @libsql/client@latest
```

2. Previously, we had a common `drizzle.config` for SQLite and Turso users, which allowed a shared strategy for both dialects. Starting with this release, we are introducing the `turso` dialect in drizzle-kit. We will evolve and improve Turso as a separate dialect with its own migration strategies.
**Before**

```ts
import { defineConfig } from "drizzle-kit";

export default defineConfig({
  dialect: "sqlite",
  schema: "./schema.ts",
  out: "./drizzle",
  dbCredentials: {
    url: "database.db",
  },
  breakpoints: true,
  verbose: true,
  strict: true,
});
```

**After**

```ts
import { defineConfig } from "drizzle-kit";

export default defineConfig({
  dialect: "turso",
  schema: "./schema.ts",
  out: "./drizzle",
  dbCredentials: {
    url: "database.db",
  },
  breakpoints: true,
  verbose: true,
  strict: true,
});
```

If you are using only SQLite, you can use `dialect: "sqlite"`.

## LibSQL/Turso and SQLite migration updates

### SQLite "generate" and "push" statements updates

Starting from this release, we will no longer generate comments like this:

```sql
'/*\n SQLite does not support "Changing existing column type" out of the box, we do not generate automatic migration for that, so it has to be done manually'
+ '\n Please refer to: https://www.techonthenet.com/sqlite/tables/alter_table.php'
+ '\n https://www.sqlite.org/lang_altertable.html'
+ '\n https://stackoverflow.com/questions/2083543/modify-a-columns-type-in-sqlite3'
+ "\n\n Due to that we don't generate migration automatically and it has to be done manually"
+ '\n*/'
```

We will generate a set of statements, and you can decide if it's appropriate to create data-moving statements instead.
Here is an example of the SQL file you'll receive now:

```sql
PRAGMA foreign_keys=OFF;
--> statement-breakpoint
CREATE TABLE `__new_worker` (
  `id` integer PRIMARY KEY NOT NULL,
  `name` text NOT NULL,
  `salary` text NOT NULL,
  `job_id` integer,
  FOREIGN KEY (`job_id`) REFERENCES `job`(`id`) ON UPDATE no action ON DELETE no action
);
--> statement-breakpoint
INSERT INTO `__new_worker`("id", "name", "salary", "job_id") SELECT "id", "name", "salary", "job_id" FROM `worker`;
--> statement-breakpoint
DROP TABLE `worker`;
--> statement-breakpoint
ALTER TABLE `__new_worker` RENAME TO `worker`;
--> statement-breakpoint
PRAGMA foreign_keys=ON;
```

### LibSQL/Turso "generate" and "push" statements updates

Since LibSQL supports more ALTER statements than SQLite, we can generate more statements without recreating your schema and moving all the data, which can be potentially dangerous for production environments.

LibSQL and Turso will now have a separate dialect in the Drizzle config file, meaning that we will evolve Turso and LibSQL independently from SQLite and will aim to support as many features as Turso/LibSQL offer.

With the updated LibSQL migration strategy, you will have the ability to:

- **Change Data Type**: Set a new data type for existing columns.
- **Set and Drop Default Values**: Add or remove default values for existing columns.
- **Set and Drop NOT NULL**: Add or remove the NOT NULL constraint on existing columns.
- **Add References to Existing Columns**: Add foreign key references to existing columns.

You can find more information in the [LibSQL documentation](https://github.com/tursodatabase/libsql/blob/main/libsql-sqlite3/doc/libsql_extensions.md#altering-columns).

### LIMITATIONS

- Dropping or altering an index will cause table recreation.

This is because LibSQL/Turso does not support dropping this type of index.
```sql
CREATE TABLE `users` (
  `id` integer NOT NULL,
  `name` integer,
  `age` integer PRIMARY KEY NOT NULL,
  FOREIGN KEY (`name`) REFERENCES `users1`("id") ON UPDATE no action ON DELETE no action
);
```

- If the table has indexes, altering columns will cause table recreation.
- Drizzle-Kit will drop the indexes, modify the columns, and then recreate the indexes.
- Adding or dropping composite foreign keys is not supported and will cause table recreation.

### NOTES

- You can create a reference on any column type, but if you want to insert values, the referenced column must have a unique index or primary key.

```sql
CREATE TABLE parent(a PRIMARY KEY, b UNIQUE, c, d, e, f);
CREATE UNIQUE INDEX i1 ON parent(c, d);
CREATE INDEX i2 ON parent(e);
CREATE UNIQUE INDEX i3 ON parent(f COLLATE nocase);

CREATE TABLE child1(f, g REFERENCES parent(a)); -- Ok
CREATE TABLE child2(h, i REFERENCES parent(b)); -- Ok
CREATE TABLE child3(j, k, FOREIGN KEY(j, k) REFERENCES parent(c, d)); -- Ok
CREATE TABLE child4(l, m REFERENCES parent(e)); -- Error!
CREATE TABLE child5(n, o REFERENCES parent(f)); -- Error!
CREATE TABLE child6(p, q, FOREIGN KEY(p, q) REFERENCES parent(b, c)); -- Error!
CREATE TABLE child7(r REFERENCES parent(c)); -- Error!
```

> **NOTE**: The foreign key for the table child5 is an error because, although the parent key column has a unique index, the index uses a different collating sequence.

See more: https://www.sqlite.org/foreignkeys.html

## A new and easy way to start using drizzle

Currently, the only way to get started is to define the client yourself and pass it to drizzle:

```ts
const client = new Pool({ connectionString: '' });
drizzle(client, { logger: true });
```

But we want to introduce you to a new API, which is a simplified method in addition to the existing one.

Most clients will have a few options to connect, starting with the easiest and most common one, and allowing you to control your client connection as needed.
Let's use `node-postgres` as an example, but the same pattern can be applied to all other clients.

```ts
// Finally, one import for all available clients and dialects!
import { drizzle } from 'drizzle-orm';

// Choose a client and use a connection URL — nothing else is needed!
const db1 = await drizzle("node-postgres", process.env.POSTGRES_URL);

// If you need to pass a logger, schema, or other configurations, you can use an object and specify the client-specific URL in the connection
const db2 = await drizzle("node-postgres", {
  connection: process.env.POSTGRES_URL,
  logger: true
});

// And finally, if you need to use full client/driver-specific types in connections, you can use a URL or host/port/etc. as an object inferred from the underlying client connection types
const db3 = await drizzle("node-postgres", {
  connection: {
    connectionString: process.env.POSTGRES_URL,
  },
});

const db4 = await drizzle("node-postgres", {
  connection: {
    user: process.env.DB_USER,
    password: process.env.DB_PASSWORD,
    host: process.env.DB_HOST,
    port: process.env.DB_PORT,
    database: process.env.DB_NAME,
    ssl: true,
  },
});
```

A few clients will have a slightly different API due to their specific behavior. Let's take a look at them:

For `aws-data-api-pg`, Drizzle will require `resourceArn`, `database`, and `secretArn`, along with any other AWS Data API client types for the connection, such as credentials, region, etc.

```ts
drizzle("aws-data-api-pg", {
  connection: {
    resourceArn: "",
    database: "",
    secretArn: "",
  },
});
```

For `d1`, the Cloudflare Worker types described in the [documentation](https://developers.cloudflare.com/d1/get-started/) will be required.

```ts
drizzle("d1", {
  connection: env.DB // Cloudflare Worker Types
})
```

For `vercel-postgres`, nothing is needed since Vercel automatically retrieves the `POSTGRES_URL` from the `.env` file.
You can check this [documentation](https://vercel.com/docs/storage/vercel-postgres/quickstart) for more info.

```ts
drizzle("vercel-postgres")
```

> Note that the first example with the client is still available and not deprecated. You can use it if you don't want to await the drizzle object. The new way of defining drizzle is designed to make it easier to import from one place and get autocomplete for all the available clients.

## New "count" API

Before this release, to count entities in a table, you would need to do this:

```ts
const res = await db.select({ count: sql`count(*)` }).from(users);
const count = res[0].count;
```

The new API will look like this:

```ts
// how many users are in the database
const count: number = await db.$count(users);

// how many users with the name "Dan" are in the database
const count: number = await db.$count(users, eq(users.name, "Dan"));
```

This can also work as a subquery and within relational queries:

```ts
const users = await db.select({
  ...users,
  postsCount: db.$count(posts, eq(posts.authorId, users.id))
}).from(users);

const users = await db.query.users.findMany({
  extras: {
    postsCount: db.$count(posts, eq(posts.authorId, users.id))
  }
})
```
diff --git a/drizzle-kit/build.ts b/drizzle-kit/build.ts
index 701e9c84c..ec7fc76c0 100644
--- a/drizzle-kit/build.ts
+++ b/drizzle-kit/build.ts
@@ -1,3 +1,4 @@
+/// 
 import * as esbuild from 'esbuild';
 import { readFileSync, writeFileSync } from 'node:fs';
 import * as tsup from 'tsup';
@@ -16,6 +17,7 @@ const driversPackages = [
 	// sqlite drivers
 	'@libsql/client',
 	'better-sqlite3',
+	'bun:sqlite',
 ];
 
 esbuild.buildSync({
@@ -82,6 +84,7 @@ const main = async () => {
 	await tsup.build({
 		entryPoints: ['./src/index.ts', './src/api.ts'],
 		outDir: './dist',
+		external: ['bun:sqlite'],
 		splitting: false,
 		dts: true,
 		format: ['cjs', 'esm'],
diff --git a/drizzle-kit/package.json b/drizzle-kit/package.json
index 9d9e1d227..66f19e6be 100644
--- a/drizzle-kit/package.json
+++ b/drizzle-kit/package.json @@ -1,6 +1,6 @@ { "name": "drizzle-kit", - "version": "0.24.2", + "version": "0.25.0", "homepage": "https://orm.drizzle.team", "keywords": [ "drizzle", @@ -54,7 +54,7 @@ "@electric-sql/pglite": "^0.1.5", "@hono/node-server": "^1.9.0", "@hono/zod-validator": "^0.2.1", - "@libsql/client": "^0.4.2", + "@libsql/client": "^0.10.0", "@neondatabase/serverless": "^0.9.1", "@originjs/vite-plugin-commonjs": "^1.0.3", "@planetscale/database": "^1.16.0", @@ -74,6 +74,7 @@ "@vercel/postgres": "^0.8.0", "ava": "^5.1.0", "better-sqlite3": "^9.4.3", + "bun-types": "^0.6.6", "camelcase": "^7.0.1", "chalk": "^5.2.0", "commander": "^12.1.0", diff --git a/drizzle-kit/schema.ts b/drizzle-kit/schema.ts deleted file mode 100644 index e69de29bb..000000000 diff --git a/drizzle-kit/src/cli/commands/introspect.ts b/drizzle-kit/src/cli/commands/introspect.ts index 4e51d6b49..9b5a044f6 100644 --- a/drizzle-kit/src/cli/commands/introspect.ts +++ b/drizzle-kit/src/cli/commands/introspect.ts @@ -25,6 +25,7 @@ import { } from '../../snapshotsDiffer'; import { prepareOutFolder } from '../../utils'; import type { Casing, Prefix } from '../validations/common'; +import { LibSQLCredentials } from '../validations/libsql'; import type { MysqlCredentials } from '../validations/mysql'; import type { PostgresCredentials } from '../validations/postgres'; import { SingleStoreCredentials } from '../validations/singlestore'; @@ -210,7 +211,7 @@ export const introspectMysql = async ( writeFileSync(relationsFile, relationsTs.file); console.log(); - const { snapshots, journal } = prepareOutFolder(out, 'postgresql'); + const { snapshots, journal } = prepareOutFolder(out, 'mysql'); if (snapshots.length === 0) { const { sqlStatements, _meta } = await applyMysqlSnapshotsDiff( @@ -417,7 +418,118 @@ export const introspectSqlite = async ( writeFileSync(relationsFile, relationsTs.file); console.log(); - const { snapshots, journal } = prepareOutFolder(out, 'postgresql'); + const { snapshots, 
journal } = prepareOutFolder(out, 'sqlite');
+
+	if (snapshots.length === 0) {
+		const { sqlStatements, _meta } = await applySqliteSnapshotsDiff(
+			squashSqliteScheme(drySQLite),
+			squashSqliteScheme(schema),
+			tablesResolver,
+			columnsResolver,
+			drySQLite,
+			schema,
+		);
+
+		writeResult({
+			cur: schema,
+			sqlStatements,
+			journal,
+			_meta,
+			outFolder: out,
+			breakpoints,
+			type: 'introspect',
+			prefixMode: prefix,
+		});
+	} else {
+		render(
+			`[${
+				chalk.blue(
+					'i',
+				)
+			}] No SQL generated, you already have migrations in project`,
+		);
+	}
+
+	render(
+		`[${
+			chalk.green(
+				'✓',
+			)
+		}] Your schema file is ready ➜ ${chalk.bold.underline.blue(schemaFile)} 🚀`,
+	);
+	render(
+		`[${
+			chalk.green(
+				'✓',
+			)
+		}] Your relations file is ready ➜ ${
+			chalk.bold.underline.blue(
+				relationsFile,
+			)
+		} 🚀`,
+	);
+	process.exit(0);
+};
+
+export const introspectLibSQL = async (
+	casing: Casing,
+	out: string,
+	breakpoints: boolean,
+	credentials: LibSQLCredentials,
+	tablesFilter: string[],
+	prefix: Prefix,
+) => {
+	const { connectToLibSQL } = await import('../connections');
+	const db = await connectToLibSQL(credentials);
+
+	const matchers = tablesFilter.map((it) => {
+		return new Minimatch(it);
+	});
+
+	const filter = (tableName: string) => {
+		if (matchers.length === 0) return true;
+
+		let flags: boolean[] = [];
+
+		for (let matcher of matchers) {
+			if (matcher.negate) {
+				if (!matcher.match(tableName)) {
+					flags.push(false);
+				}
+			}
+
+			if (matcher.match(tableName)) {
+				flags.push(true);
+			}
+		}
+
+		if (flags.length > 0) {
+			return flags.every(Boolean);
+		}
+		return false;
+	};
+
+	const progress = new IntrospectProgress();
+	const res = await renderWithTask(
+		progress,
+		fromSqliteDatabase(db, filter, (stage, count, status) => {
+			progress.update(stage, count, status);
+		}),
+	);
+
+	const schema = { id: originUUID, prevId: '', ...res } as SQLiteSchema;
+	const ts = sqliteSchemaToTypeScript(schema, casing);
+	const relationsTs = 
relationsToTypeScript(schema, casing); + + // check orm and orm-pg api version + + const schemaFile = join(out, 'schema.ts'); + writeFileSync(schemaFile, ts.file); + const relationsFile = join(out, 'relations.ts'); + writeFileSync(relationsFile, relationsTs.file); + console.log(); + + const { snapshots, journal } = prepareOutFolder(out, 'sqlite'); if (snapshots.length === 0) { const { sqlStatements, _meta } = await applySqliteSnapshotsDiff( diff --git a/drizzle-kit/src/cli/commands/libSqlPushUtils.ts b/drizzle-kit/src/cli/commands/libSqlPushUtils.ts new file mode 100644 index 000000000..01bb61334 --- /dev/null +++ b/drizzle-kit/src/cli/commands/libSqlPushUtils.ts @@ -0,0 +1,346 @@ +import chalk from 'chalk'; + +import { JsonStatement } from 'src/jsonStatements'; +import { findAddedAndRemoved, SQLiteDB } from 'src/utils'; +import { SQLiteSchemaInternal, SQLiteSchemaSquashed, SQLiteSquasher } from '../../serializer/sqliteSchema'; +import { + CreateSqliteIndexConvertor, + fromJson, + LibSQLModifyColumn, + SQLiteCreateTableConvertor, + SQLiteDropTableConvertor, + SqliteRenameTableConvertor, +} from '../../sqlgenerator'; + +export const getOldTableName = ( + tableName: string, + meta: SQLiteSchemaInternal['_meta'], +) => { + for (const key of Object.keys(meta.tables)) { + const value = meta.tables[key]; + if (`"${tableName}"` === value) { + return key.substring(1, key.length - 1); + } + } + return tableName; +}; + +export const _moveDataStatements = ( + tableName: string, + json: SQLiteSchemaSquashed, + dataLoss: boolean = false, +) => { + const statements: string[] = []; + + const newTableName = `__new_${tableName}`; + + // create table statement from a new json2 with proper name + const tableColumns = Object.values(json.tables[tableName].columns); + const referenceData = Object.values(json.tables[tableName].foreignKeys); + const compositePKs = Object.values( + json.tables[tableName].compositePrimaryKeys, + ).map((it) => SQLiteSquasher.unsquashPK(it)); + + const fks = 
referenceData.map((it) => SQLiteSquasher.unsquashPushFK(it)); + + // create new table + statements.push( + new SQLiteCreateTableConvertor().convert({ + type: 'sqlite_create_table', + tableName: newTableName, + columns: tableColumns, + referenceData: fks, + compositePKs, + }), + ); + + // move data + if (!dataLoss) { + const columns = Object.keys(json.tables[tableName].columns).map( + (c) => `"${c}"`, + ); + + statements.push( + `INSERT INTO \`${newTableName}\`(${ + columns.join( + ', ', + ) + }) SELECT ${columns.join(', ')} FROM \`${tableName}\`;`, + ); + } + + statements.push( + new SQLiteDropTableConvertor().convert({ + type: 'drop_table', + tableName: tableName, + schema: '', + }), + ); + + // rename table + statements.push( + new SqliteRenameTableConvertor().convert({ + fromSchema: '', + tableNameFrom: newTableName, + tableNameTo: tableName, + toSchema: '', + type: 'rename_table', + }), + ); + + for (const idx of Object.values(json.tables[tableName].indexes)) { + statements.push( + new CreateSqliteIndexConvertor().convert({ + type: 'create_index', + tableName: tableName, + schema: '', + data: idx, + }), + ); + } + return statements; +}; + +export const libSqlLogSuggestionsAndReturn = async ( + connection: SQLiteDB, + statements: JsonStatement[], + json1: SQLiteSchemaSquashed, + json2: SQLiteSchemaSquashed, + meta: SQLiteSchemaInternal['_meta'], +) => { + let shouldAskForApprove = false; + const statementsToExecute: string[] = []; + const infoToPrint: string[] = []; + + const tablesToRemove: string[] = []; + const columnsToRemove: string[] = []; + const tablesToTruncate: string[] = []; + + for (const statement of statements) { + if (statement.type === 'drop_table') { + const res = await connection.query<{ count: string }>( + `select count(*) as count from \`${statement.tableName}\``, + ); + const count = Number(res[0].count); + if (count > 0) { + infoToPrint.push( + `· You're about to delete ${ + chalk.underline( + statement.tableName, + ) + } table with 
${count} items`, + ); + tablesToRemove.push(statement.tableName); + shouldAskForApprove = true; + } + const fromJsonStatement = fromJson([statement], 'turso', 'push', json2); + statementsToExecute.push( + ...(Array.isArray(fromJsonStatement) ? fromJsonStatement : [fromJsonStatement]), + ); + } else if (statement.type === 'alter_table_drop_column') { + const tableName = statement.tableName; + + const res = await connection.query<{ count: string }>( + `select count(*) as count from \`${tableName}\``, + ); + const count = Number(res[0].count); + if (count > 0) { + infoToPrint.push( + `· You're about to delete ${ + chalk.underline( + statement.columnName, + ) + } column in ${tableName} table with ${count} items`, + ); + columnsToRemove.push(`${tableName}_${statement.columnName}`); + shouldAskForApprove = true; + } + + const fromJsonStatement = fromJson([statement], 'turso', 'push', json2); + statementsToExecute.push( + ...(Array.isArray(fromJsonStatement) ? fromJsonStatement : [fromJsonStatement]), + ); + } else if ( + statement.type === 'sqlite_alter_table_add_column' + && statement.column.notNull + && !statement.column.default + ) { + const newTableName = statement.tableName; + const res = await connection.query<{ count: string }>( + `select count(*) as count from \`${newTableName}\``, + ); + const count = Number(res[0].count); + if (count > 0) { + infoToPrint.push( + `· You're about to add not-null ${ + chalk.underline( + statement.column.name, + ) + } column without default value, which contains ${count} items`, + ); + + tablesToTruncate.push(newTableName); + statementsToExecute.push(`delete from ${newTableName};`); + + shouldAskForApprove = true; + } + + const fromJsonStatement = fromJson([statement], 'turso', 'push', json2); + statementsToExecute.push( + ...(Array.isArray(fromJsonStatement) ? 
fromJsonStatement : [fromJsonStatement]), + ); + } else if (statement.type === 'alter_table_alter_column_set_notnull') { + const tableName = statement.tableName; + + if ( + statement.type === 'alter_table_alter_column_set_notnull' + && typeof statement.columnDefault === 'undefined' + ) { + const res = await connection.query<{ count: string }>( + `select count(*) as count from \`${tableName}\``, + ); + const count = Number(res[0].count); + if (count > 0) { + infoToPrint.push( + `· You're about to add not-null constraint to ${ + chalk.underline( + statement.columnName, + ) + } column without default value, which contains ${count} items`, + ); + + tablesToTruncate.push(tableName); + statementsToExecute.push(`delete from \`${tableName}\``); + shouldAskForApprove = true; + } + } + + const modifyStatements = new LibSQLModifyColumn().convert(statement, json2); + + statementsToExecute.push( + ...(Array.isArray(modifyStatements) ? modifyStatements : [modifyStatements]), + ); + } else if (statement.type === 'recreate_table') { + const tableName = statement.tableName; + + let dataLoss = false; + + const oldTableName = getOldTableName(tableName, meta); + + const prevColumnNames = Object.keys(json1.tables[oldTableName].columns); + const currentColumnNames = Object.keys(json2.tables[tableName].columns); + const { removedColumns, addedColumns } = findAddedAndRemoved( + prevColumnNames, + currentColumnNames, + ); + + if (removedColumns.length) { + for (const removedColumn of removedColumns) { + const res = await connection.query<{ count: string }>( + `select count(\`${tableName}\`.\`${removedColumn}\`) as count from \`${tableName}\``, + ); + + const count = Number(res[0].count); + if (count > 0) { + infoToPrint.push( + `· You're about to delete ${ + chalk.underline( + removedColumn, + ) + } column in ${tableName} table with ${count} items`, + ); + columnsToRemove.push(removedColumn); + shouldAskForApprove = true; + } + } + } + + if (addedColumns.length) { + for (const addedColumn 
of addedColumns) { + const [res] = await connection.query<{ count: string }>( + `select count(*) as count from \`${tableName}\``, + ); + + const columnConf = json2.tables[tableName].columns[addedColumn]; + + const count = Number(res.count); + if (count > 0 && columnConf.notNull && !columnConf.default) { + dataLoss = true; + + infoToPrint.push( + `· You're about to add not-null ${ + chalk.underline( + addedColumn, + ) + } column without default value to table, which contains ${count} items`, + ); + shouldAskForApprove = true; + tablesToTruncate.push(tableName); + + statementsToExecute.push(`DELETE FROM \`${tableName}\`;`); + } + } + } + + // check if some tables referencing current for pragma + const tablesReferencingCurrent: string[] = []; + + for (const table of Object.values(json2.tables)) { + const tablesRefs = Object.values(json2.tables[table.name].foreignKeys) + .filter((t) => SQLiteSquasher.unsquashPushFK(t).tableTo === tableName) + .map((it) => SQLiteSquasher.unsquashPushFK(it).tableFrom); + + tablesReferencingCurrent.push(...tablesRefs); + } + + if (!tablesReferencingCurrent.length) { + statementsToExecute.push(..._moveDataStatements(tableName, json2, dataLoss)); + continue; + } + + // recreate table + statementsToExecute.push( + ..._moveDataStatements(tableName, json2, dataLoss), + ); + } else if ( + statement.type === 'alter_table_alter_column_set_generated' + || statement.type === 'alter_table_alter_column_drop_generated' + ) { + const tableName = statement.tableName; + + const res = await connection.query<{ count: string }>( + `select count("${statement.columnName}") as count from \`${tableName}\``, + ); + const count = Number(res[0].count); + if (count > 0) { + infoToPrint.push( + `· You're about to delete ${ + chalk.underline( + statement.columnName, + ) + } column in ${tableName} table with ${count} items`, + ); + columnsToRemove.push(`${tableName}_${statement.columnName}`); + shouldAskForApprove = true; + } + const fromJsonStatement = 
fromJson([statement], 'turso', 'push', json2); + statementsToExecute.push( + ...(Array.isArray(fromJsonStatement) ? fromJsonStatement : [fromJsonStatement]), + ); + } else { + const fromJsonStatement = fromJson([statement], 'turso', 'push', json2); + statementsToExecute.push( + ...(Array.isArray(fromJsonStatement) ? fromJsonStatement : [fromJsonStatement]), + ); + } + } + + return { + statementsToExecute: [...new Set(statementsToExecute)], + shouldAskForApprove, + infoToPrint, + columnsToRemove: [...new Set(columnsToRemove)], + tablesToTruncate: [...new Set(tablesToTruncate)], + tablesToRemove: [...new Set(tablesToRemove)], + }; +}; diff --git a/drizzle-kit/src/cli/commands/migrate.ts b/drizzle-kit/src/cli/commands/migrate.ts index 0672fe734..6fd9120ae 100644 --- a/drizzle-kit/src/cli/commands/migrate.ts +++ b/drizzle-kit/src/cli/commands/migrate.ts @@ -20,6 +20,7 @@ import { MySqlSchema, mysqlSchema, squashMysqlScheme } from '../../serializer/my import { PgSchema, pgSchema, squashPgScheme } from '../../serializer/pgSchema'; import { SQLiteSchema, sqliteSchema, squashSqliteScheme } from '../../serializer/sqliteSchema'; import { + applyLibSQLSnapshotsDiff, applyMysqlSnapshotsDiff, applyPgSnapshotsDiff, applySingleStoreSnapshotsDiff, @@ -395,6 +396,58 @@ export const prepareAndMigrateMysql = async (config: GenerateConfig) => { } }; +// Not needed for now +function mySingleStoreSchemaSuggestions( + curSchema: TypeOf, + prevSchema: TypeOf, +) { + const suggestions: string[] = []; + const usedSuggestions: string[] = []; + const suggestionTypes = { + // TODO: Check if SingleStore has serial type + serial: withStyle.errorWarning( + `We deprecated the use of 'serial' for SingleStore starting from version 0.20.0. In SingleStore, 'serial' is simply an alias for 'bigint unsigned not null auto_increment unique,' which creates all constraints and indexes for you. 
This may make the process less explicit for both users and drizzle-kit push commands`, + ), + }; + + for (const table of Object.values(curSchema.tables)) { + for (const column of Object.values(table.columns)) { + if (column.type === 'serial') { + if (!usedSuggestions.includes('serial')) { + suggestions.push(suggestionTypes['serial']); + } + + const uniqueForSerial = Object.values( + prevSchema.tables[table.name].uniqueConstraints, + ).find((it) => it.columns[0] === column.name); + + suggestions.push( + `\n` + + withStyle.suggestion( + `We are suggesting to change ${ + chalk.blue( + column.name, + ) + } column in ${ + chalk.blueBright( + table.name, + ) + } table from serial to bigint unsigned\n\n${ + chalk.blueBright( + `bigint("${column.name}", { mode: "number", unsigned: true }).notNull().autoincrement().unique(${ + uniqueForSerial?.name ? `"${uniqueForSerial?.name}"` : '' + })`, + ) + }`, + ), + ); + } + } + } + + return suggestions; +} + // Intersect with prepareAnMigrate export const prepareSingleStorePush = async ( schemaPath: string | string[], @@ -546,6 +599,65 @@ export const prepareAndMigrateSqlite = async (config: GenerateConfig) => { } }; +export const prepareAndMigrateLibSQL = async (config: GenerateConfig) => { + const outFolder = config.out; + const schemaPath = config.schema; + + try { + assertV1OutFolder(outFolder); + + const { snapshots, journal } = prepareMigrationFolder(outFolder, 'sqlite'); + const { prev, cur, custom } = await prepareSqliteMigrationSnapshot( + snapshots, + schemaPath, + ); + + const validatedPrev = sqliteSchema.parse(prev); + const validatedCur = sqliteSchema.parse(cur); + + if (config.custom) { + writeResult({ + cur: custom, + sqlStatements: [], + journal, + outFolder, + name: config.name, + breakpoints: config.breakpoints, + bundle: config.bundle, + type: 'custom', + prefixMode: config.prefix, + }); + return; + } + + const squashedPrev = squashSqliteScheme(validatedPrev); + const squashedCur = 
squashSqliteScheme(validatedCur); + + const { sqlStatements, _meta } = await applyLibSQLSnapshotsDiff( + squashedPrev, + squashedCur, + tablesResolver, + columnsResolver, + validatedPrev, + validatedCur, + ); + + writeResult({ + cur, + sqlStatements, + journal, + _meta, + outFolder, + name: config.name, + breakpoints: config.breakpoints, + bundle: config.bundle, + prefixMode: config.prefix, + }); + } catch (e) { + console.error(e); + } +}; + export const prepareSQLitePush = async ( schemaPath: string | string[], snapshot: SQLiteSchema, @@ -577,6 +689,37 @@ export const prepareSQLitePush = async ( }; }; +export const prepareLibSQLPush = async ( + schemaPath: string | string[], + snapshot: SQLiteSchema, +) => { + const { prev, cur } = await prepareSQLiteDbPushSnapshot(snapshot, schemaPath); + + const validatedPrev = sqliteSchema.parse(prev); + const validatedCur = sqliteSchema.parse(cur); + + const squashedPrev = squashSqliteScheme(validatedPrev, 'push'); + const squashedCur = squashSqliteScheme(validatedCur, 'push'); + + const { sqlStatements, statements, _meta } = await applyLibSQLSnapshotsDiff( + squashedPrev, + squashedCur, + tablesResolver, + columnsResolver, + validatedPrev, + validatedCur, + 'push', + ); + + return { + sqlStatements, + statements, + squashedPrev, + squashedCur, + meta: _meta, + }; +}; + const freeeeeeze = (obj: any) => { Object.freeze(obj); for (let key in obj) { diff --git a/drizzle-kit/src/cli/commands/push.ts b/drizzle-kit/src/cli/commands/push.ts index d54f8c6eb..0cb3a8d28 100644 --- a/drizzle-kit/src/cli/commands/push.ts +++ b/drizzle-kit/src/cli/commands/push.ts @@ -2,11 +2,13 @@ import chalk from 'chalk'; import { render } from 'hanji'; import { fromJson } from '../../sqlgenerator'; import { Select } from '../selector-ui'; +import { LibSQLCredentials } from '../validations/libsql'; import type { MysqlCredentials } from '../validations/mysql'; import { withStyle } from '../validations/outputs'; import type { PostgresCredentials } from 
'../validations/postgres'; import { SingleStoreCredentials } from '../validations/singlestore'; import type { SqliteCredentials } from '../validations/sqlite'; +import { libSqlLogSuggestionsAndReturn } from './libSqlPushUtils'; import { filterStatements as mySqlFilterStatements, logSuggestionsAndReturn as mySqlLogSuggestionsAndReturn, @@ -77,7 +79,6 @@ export const mysqlPush = async ( if (verbose) { console.log(); - // console.log(chalk.gray('Verbose logs:')); console.log( withStyle.warning('You are about to execute current statements:'), ); @@ -439,8 +440,8 @@ export const sqlitePush = async ( } = await sqliteSuggestions( db, statements.statements, - statements.squashedCur, statements.squashedPrev, + statements.squashedCur, statements.meta!, ); @@ -520,10 +521,114 @@ export const sqlitePush = async ( await db.query('rollback'); process.exit(1); } - } else if (credentials.driver === 'turso') { - await db.batch!(statementsToExecute.map((it) => ({ query: it }))); } render(`[${chalk.green('✓')}] Changes applied`); } } }; + +export const libSQLPush = async ( + schemaPath: string | string[], + verbose: boolean, + strict: boolean, + credentials: LibSQLCredentials, + tablesFilter: string[], + force: boolean, +) => { + const { connectToLibSQL } = await import('../connections'); + const { sqlitePushIntrospect } = await import('./sqliteIntrospect'); + + const db = await connectToLibSQL(credentials); + const { schema } = await sqlitePushIntrospect(db, tablesFilter); + + const { prepareLibSQLPush } = await import('./migrate'); + + const statements = await prepareLibSQLPush(schemaPath, schema); + + if (statements.sqlStatements.length === 0) { + render(`\n[${chalk.blue('i')}] No changes detected`); + } else { + const { + shouldAskForApprove, + statementsToExecute, + columnsToRemove, + tablesToRemove, + tablesToTruncate, + infoToPrint, + } = await libSqlLogSuggestionsAndReturn( + db, + statements.statements, + statements.squashedPrev, + statements.squashedCur, + statements.meta!, 
+ ); + + if (verbose && statementsToExecute.length > 0) { + console.log(); + console.log( + withStyle.warning('You are about to execute current statements:'), + ); + console.log(); + console.log(statementsToExecute.map((s) => chalk.blue(s)).join('\n')); + console.log(); + } + + if (!force && strict) { + if (!shouldAskForApprove) { + const { status, data } = await render( + new Select(['No, abort', `Yes, I want to execute all statements`]), + ); + if (data?.index === 0) { + render(`[${chalk.red('x')}] All changes were aborted`); + process.exit(0); + } + } + } + + if (!force && shouldAskForApprove) { + console.log(withStyle.warning('Found data-loss statements:')); + console.log(infoToPrint.join('\n')); + console.log(); + console.log( + chalk.red.bold( + 'THIS ACTION WILL CAUSE DATA LOSS AND CANNOT BE REVERTED\n', + ), + ); + + console.log(chalk.white('Do you still want to push changes?')); + + const { status, data } = await render( + new Select([ + 'No, abort', + `Yes, I want to${ + tablesToRemove.length > 0 + ? ` remove ${tablesToRemove.length} ${tablesToRemove.length > 1 ? 'tables' : 'table'},` + : ' ' + }${ + columnsToRemove.length > 0 + ? ` remove ${columnsToRemove.length} ${columnsToRemove.length > 1 ? 'columns' : 'column'},` + : ' ' + }${ + tablesToTruncate.length > 0 + ? ` truncate ${tablesToTruncate.length} ${tablesToTruncate.length > 1 ? 
'tables' : 'table'}` + : '' + }` + .trimEnd() + .replace(/(^,)|(,$)/g, '') + .replace(/ +(?= )/g, ''), + ]), + ); + if (data?.index === 0) { + render(`[${chalk.red('x')}] All changes were aborted`); + process.exit(0); + } + } + + if (statementsToExecute.length === 0) { + render(`\n[${chalk.blue('i')}] No changes detected`); + } else { + await db.batchWithPragma!(statementsToExecute); + render(`[${chalk.green('✓')}] Changes applied`); + } + } +}; diff --git a/drizzle-kit/src/cli/commands/singlestoreUp.ts b/drizzle-kit/src/cli/commands/singlestoreUp.ts new file mode 100644 index 000000000..dc5004ed0 --- /dev/null +++ b/drizzle-kit/src/cli/commands/singlestoreUp.ts @@ -0,0 +1 @@ +export const upSinglestoreHandler = (out: string) => {}; diff --git a/drizzle-kit/src/cli/commands/sqlitePushUtils.ts b/drizzle-kit/src/cli/commands/sqlitePushUtils.ts index 451f035a7..bcc2d19db 100644 --- a/drizzle-kit/src/cli/commands/sqlitePushUtils.ts +++ b/drizzle-kit/src/cli/commands/sqlitePushUtils.ts @@ -10,7 +10,7 @@ import { } from '../../sqlgenerator'; import type { JsonStatement } from '../../jsonStatements'; -import type { DB, SQLiteDB } from '../../utils'; +import { findAddedAndRemoved, type SQLiteDB } from '../../utils'; export const _moveDataStatements = ( tableName: string, @@ -19,16 +19,7 @@ export const _moveDataStatements = ( ) => { const statements: string[] = []; - // rename table to __old_${tablename} - statements.push( - new SqliteRenameTableConvertor().convert({ - type: 'rename_table', - tableNameFrom: tableName, - tableNameTo: `__old_push_${tableName}`, - fromSchema: '', - toSchema: '', - }), - ); + const newTableName = `__new_${tableName}`; // create table statement from a new json2 with proper name const tableColumns = Object.values(json.tables[tableName].columns); @@ -39,10 +30,11 @@ export const _moveDataStatements = ( const fks = referenceData.map((it) => SQLiteSquasher.unsquashPushFK(it)); + // create new table statements.push( new 
SQLiteCreateTableConvertor().convert({ type: 'sqlite_create_table', - tableName: tableName, + tableName: newTableName, columns: tableColumns, referenceData: fks, compositePKs, @@ -51,19 +43,38 @@ export const _moveDataStatements = ( // move data if (!dataLoss) { + const columns = Object.keys(json.tables[tableName].columns).map( + (c) => `"${c}"`, + ); + statements.push( - `INSERT INTO "${tableName}" SELECT * FROM "__old_push_${tableName}";`, + `INSERT INTO \`${newTableName}\`(${ + columns.join( + ', ', + ) + }) SELECT ${columns.join(', ')} FROM \`${tableName}\`;`, ); } - // drop table with name __old_${tablename} + statements.push( new SQLiteDropTableConvertor().convert({ type: 'drop_table', - tableName: `__old_push_${tableName}`, + tableName: tableName, schema: '', }), ); + // rename table + statements.push( + new SqliteRenameTableConvertor().convert({ + fromSchema: '', + tableNameFrom: newTableName, + tableNameTo: tableName, + toSchema: '', + type: 'rename_table', + }), + ); + for (const idx of Object.values(json.tables[tableName].indexes)) { statements.push( new CreateSqliteIndexConvertor().convert({ @@ -120,8 +131,6 @@ export const logSuggestionsAndReturn = async ( const schemasToRemove: string[] = []; const tablesToTruncate: string[] = []; - const tablesContext: Record = {}; - for (const statement of statements) { if (statement.type === 'drop_table') { const res = await connection.query<{ count: string }>( @@ -139,248 +148,159 @@ export const logSuggestionsAndReturn = async ( tablesToRemove.push(statement.tableName); shouldAskForApprove = true; } - const stmnt = fromJson([statement], 'sqlite')[0]; - statementsToExecute.push(stmnt); - } else if (statement.type === 'alter_table_drop_column') { - const newTableName = getOldTableName(statement.tableName, meta); - const columnIsPartOfPk = Object.values( - json1.tables[newTableName].compositePrimaryKeys, - ).find((c) => SQLiteSquasher.unsquashPK(c).includes(statement.columnName)); - - const columnIsPartOfIndex = 
Object.values( - json1.tables[newTableName].indexes, - ).find((c) => SQLiteSquasher.unsquashIdx(c).columns.includes(statement.columnName)); - - const columnIsPk = json1.tables[newTableName].columns[statement.columnName].primaryKey; - - const columnIsPartOfFk = Object.values( - json1.tables[newTableName].foreignKeys, - ).find((t) => - SQLiteSquasher.unsquashPushFK(t).columnsFrom.includes( - statement.columnName, - ) + const fromJsonStatement = fromJson([statement], 'sqlite', 'push'); + statementsToExecute.push( + ...(Array.isArray(fromJsonStatement) ? fromJsonStatement : [fromJsonStatement]), ); + } else if (statement.type === 'alter_table_drop_column') { + const tableName = statement.tableName; + const columnName = statement.columnName; const res = await connection.query<{ count: string }>( - `select count(*) as count from \`${newTableName}\``, + `select count(\`${tableName}\`.\`${columnName}\`) as count from \`${tableName}\``, ); const count = Number(res[0].count); if (count > 0) { infoToPrint.push( `· You're about to delete ${ chalk.underline( - statement.columnName, + columnName, ) - } column in ${newTableName} table with ${count} items`, + } column in ${tableName} table with ${count} items`, ); - columnsToRemove.push(`${newTableName}_${statement.columnName}`); + columnsToRemove.push(`${tableName}_${statement.columnName}`); shouldAskForApprove = true; } - if ( - columnIsPk - || columnIsPartOfPk - || columnIsPartOfIndex - || columnIsPartOfFk - ) { - tablesContext[newTableName] = [ - ..._moveDataStatements(statement.tableName, json2, true), - ]; - // check table that have fk to this table - - const tablesReferncingCurrent: string[] = []; - - for (const table of Object.values(json1.tables)) { - const tablesRefs = Object.values(json1.tables[table.name].foreignKeys) - .filter( - (t) => SQLiteSquasher.unsquashPushFK(t).tableTo === newTableName, + const fromJsonStatement = fromJson([statement], 'sqlite', 'push'); + statementsToExecute.push( + 
...(Array.isArray(fromJsonStatement) ? fromJsonStatement : [fromJsonStatement]), + ); + } else if ( + statement.type === 'sqlite_alter_table_add_column' + && (statement.column.notNull && !statement.column.default) + ) { + const tableName = statement.tableName; + const columnName = statement.column.name; + const res = await connection.query<{ count: string }>( + `select count(*) as count from \`${tableName}\``, + ); + const count = Number(res[0].count); + if (count > 0) { + infoToPrint.push( + `· You're about to add not-null ${ + chalk.underline( + columnName, ) - .map((t) => SQLiteSquasher.unsquashPushFK(t).tableFrom); - - tablesReferncingCurrent.push(...tablesRefs); - } - - const uniqueTableRefs = [...new Set(tablesReferncingCurrent)]; - - for (const table of uniqueTableRefs) { - if (typeof tablesContext[table] === 'undefined') { - tablesContext[table] = [..._moveDataStatements(table, json2)]; - } - } - } else { - if (typeof tablesContext[newTableName] === 'undefined') { - const stmnt = fromJson([statement], 'sqlite')[0]; - statementsToExecute.push(stmnt); - } - } - } else if (statement.type === 'sqlite_alter_table_add_column') { - const newTableName = getOldTableName(statement.tableName, meta); - if (statement.column.notNull && !statement.column.default) { - const res = await connection.query<{ count: string }>( - `select count(*) as count from \`${newTableName}\``, + } column without default value, which contains ${count} items`, ); - const count = Number(res[0].count); - if (count > 0) { - infoToPrint.push( - `· You're about to add not-null ${ - chalk.underline( - statement.column.name, - ) - } column without default value, which contains ${count} items`, - ); - tablesToTruncate.push(newTableName); - statementsToExecute.push(`delete from ${newTableName};`); + tablesToTruncate.push(tableName); + statementsToExecute.push(`delete from ${tableName};`); - shouldAskForApprove = true; - } + shouldAskForApprove = true; } - if (statement.column.primaryKey) { - 
tablesContext[newTableName] = [ - ..._moveDataStatements(statement.tableName, json2, true), - ]; - const tablesReferncingCurrent: string[] = []; - - for (const table of Object.values(json1.tables)) { - const tablesRefs = Object.values(json1.tables[table.name].foreignKeys) - .filter( - (t) => SQLiteSquasher.unsquashPushFK(t).tableTo === newTableName, - ) - .map((t) => SQLiteSquasher.unsquashPushFK(t).tableFrom); - tablesReferncingCurrent.push(...tablesRefs); - } + const fromJsonStatement = fromJson([statement], 'sqlite', 'push'); + statementsToExecute.push( + ...(Array.isArray(fromJsonStatement) ? fromJsonStatement : [fromJsonStatement]), + ); + } else if (statement.type === 'recreate_table') { + const tableName = statement.tableName; + const oldTableName = getOldTableName(tableName, meta); - const uniqueTableRefs = [...new Set(tablesReferncingCurrent)]; + let dataLoss = false; - for (const table of uniqueTableRefs) { - if (typeof tablesContext[table] === 'undefined') { - tablesContext[table] = [..._moveDataStatements(table, json2)]; - } - } - } else { - if (typeof tablesContext[newTableName] === 'undefined') { - const stmnt = fromJson([statement], 'sqlite')[0]; - statementsToExecute.push(stmnt); - } - } - } else if ( - statement.type === 'alter_table_alter_column_set_type' - || statement.type === 'alter_table_alter_column_set_default' - || statement.type === 'alter_table_alter_column_drop_default' - || statement.type === 'alter_table_alter_column_set_notnull' - || statement.type === 'alter_table_alter_column_drop_notnull' - || statement.type === 'alter_table_alter_column_drop_autoincrement' - || statement.type === 'alter_table_alter_column_set_autoincrement' - || statement.type === 'alter_table_alter_column_drop_pk' - || statement.type === 'alter_table_alter_column_set_pk' - ) { - if ( - !( - statement.type === 'alter_table_alter_column_set_notnull' - && statement.columnPk - ) - ) { - const newTableName = getOldTableName(statement.tableName, meta); - if ( - 
statement.type === 'alter_table_alter_column_set_notnull' - && typeof statement.columnDefault === 'undefined' - ) { + const prevColumnNames = Object.keys(json1.tables[oldTableName].columns); + const currentColumnNames = Object.keys(json2.tables[tableName].columns); + const { removedColumns, addedColumns } = findAddedAndRemoved( + prevColumnNames, + currentColumnNames, + ); + + if (removedColumns.length) { + for (const removedColumn of removedColumns) { const res = await connection.query<{ count: string }>( - `select count(*) as count from \`${newTableName}\``, + `select count(\`${tableName}\`.\`${removedColumn}\`) as count from \`${tableName}\``, ); + const count = Number(res[0].count); if (count > 0) { infoToPrint.push( - `· You're about to add not-null constraint to ${ + `· You're about to delete ${ chalk.underline( - statement.columnName, + removedColumn, ) - } column without default value, which contains ${count} items`, + } column in ${tableName} table with ${count} items`, ); - - tablesToTruncate.push(newTableName); + columnsToRemove.push(removedColumn); shouldAskForApprove = true; } - tablesContext[newTableName] = _moveDataStatements( - statement.tableName, - json1, - true, + } + } + + if (addedColumns.length) { + for (const addedColumn of addedColumns) { + const [res] = await connection.query<{ count: string }>( + `select count(*) as count from \`${tableName}\``, ); - } else { - if (typeof tablesContext[newTableName] === 'undefined') { - tablesContext[newTableName] = _moveDataStatements( - statement.tableName, - json1, + + const columnConf = json2.tables[tableName].columns[addedColumn]; + + const count = Number(res.count); + if (count > 0 && columnConf.notNull && !columnConf.default) { + dataLoss = true; + infoToPrint.push( + `· You're about to add not-null ${ + chalk.underline( + addedColumn, + ) + } column without default value to table, which contains ${count} items`, ); + shouldAskForApprove = true; + tablesToTruncate.push(tableName); + + 
statementsToExecute.push(`DELETE FROM \`${tableName}\`;`); } } + } - const tablesReferncingCurrent: string[] = []; + // check if some tables referencing current for pragma + const tablesReferencingCurrent: string[] = []; - for (const table of Object.values(json1.tables)) { - const tablesRefs = Object.values(json1.tables[table.name].foreignKeys) - .filter( - (t) => SQLiteSquasher.unsquashPushFK(t).tableTo === newTableName, - ) - .map((t) => { - return getNewTableName( - SQLiteSquasher.unsquashPushFK(t).tableFrom, - meta, - ); - }); - - tablesReferncingCurrent.push(...tablesRefs); - } + for (const table of Object.values(json2.tables)) { + const tablesRefs = Object.values(json2.tables[table.name].foreignKeys) + .filter((t) => SQLiteSquasher.unsquashPushFK(t).tableTo === tableName) + .map((it) => SQLiteSquasher.unsquashPushFK(it).tableFrom); - const uniqueTableRefs = [...new Set(tablesReferncingCurrent)]; + tablesReferencingCurrent.push(...tablesRefs); + } - for (const table of uniqueTableRefs) { - if (typeof tablesContext[table] === 'undefined') { - tablesContext[table] = [..._moveDataStatements(table, json1)]; - } - } + if (!tablesReferencingCurrent.length) { + statementsToExecute.push(..._moveDataStatements(tableName, json2, dataLoss)); + continue; } - } else if ( - statement.type === 'create_reference' - || statement.type === 'delete_reference' - || statement.type === 'alter_reference' - ) { - const fk = SQLiteSquasher.unsquashPushFK(statement.data); - if (typeof tablesContext[statement.tableName] === 'undefined') { - tablesContext[statement.tableName] = _moveDataStatements( - statement.tableName, - json2, - ); + const [{ foreign_keys: pragmaState }] = await connection.query<{ + foreign_keys: number; + }>(`PRAGMA foreign_keys;`); + + if (pragmaState) { + statementsToExecute.push(`PRAGMA foreign_keys=OFF;`); } - } else if ( - statement.type === 'create_composite_pk' - || statement.type === 'alter_composite_pk' - || statement.type === 'delete_composite_pk' - || 
statement.type === 'create_unique_constraint' - || statement.type === 'delete_unique_constraint' - ) { - const newTableName = getOldTableName(statement.tableName, meta); - if (typeof tablesContext[newTableName] === 'undefined') { - tablesContext[newTableName] = _moveDataStatements( - statement.tableName, - json2, - ); + statementsToExecute.push(..._moveDataStatements(tableName, json2, dataLoss)); + if (pragmaState) { + statementsToExecute.push(`PRAGMA foreign_keys=ON;`); } } else { - const stmnt = fromJson([statement], 'sqlite'); - if (typeof stmnt !== 'undefined') { - statementsToExecute.push(...stmnt); - } + const fromJsonStatement = fromJson([statement], 'sqlite', 'push'); + statementsToExecute.push( + ...(Array.isArray(fromJsonStatement) ? fromJsonStatement : [fromJsonStatement]), + ); } } - for (const context of Object.values(tablesContext)) { - statementsToExecute.push(...context); - } - return { statementsToExecute, shouldAskForApprove, diff --git a/drizzle-kit/src/cli/commands/utils.ts b/drizzle-kit/src/cli/commands/utils.ts index 8f51d0c18..25e80ed4d 100644 --- a/drizzle-kit/src/cli/commands/utils.ts +++ b/drizzle-kit/src/cli/commands/utils.ts @@ -16,6 +16,11 @@ import { Prefix, wrapParam, } from '../validations/common'; +import { + LibSQLCredentials, + libSQLCredentials, + printConfigConnectionIssues as printIssuesLibSql, +} from '../validations/libsql'; import { MysqlCredentials, mysqlCredentials, @@ -216,6 +221,14 @@ export const preparePushConfig = async ( dialect: 'sqlite'; credentials: SqliteCredentials; } + | { + dialect: 'turso'; + credentials: LibSQLCredentials; + } + | { + dialect: 'singlestore'; + credentials: SingleStoreCredentials; + } | { dialect: 'singlestore'; credentials: SingleStoreCredentials; @@ -321,7 +334,7 @@ export const preparePushConfig = async ( if (config.dialect === 'singlestore') { const parsed = singlestoreCredentials.safeParse(config); if (!parsed.success) { - printIssuesPg(config); + printIssuesSingleStore(config); 
process.exit(1); } @@ -355,6 +368,24 @@ export const preparePushConfig = async ( }; } + if (config.dialect === 'turso') { + const parsed = libSQLCredentials.safeParse(config); + if (!parsed.success) { + printIssuesLibSql(config, 'push'); + process.exit(1); + } + return { + dialect: 'turso', + schemaPath: config.schema, + strict: config.strict ?? false, + verbose: config.verbose ?? false, + force: (options.force as boolean) ?? false, + credentials: parsed.data, + tablesFilter, + schemasFilter, + }; + } + assertUnreachable(config.dialect); }; @@ -379,6 +410,14 @@ export const preparePullConfig = async ( dialect: 'singlestore'; credentials: SingleStoreCredentials; } + | { + dialect: 'turso'; + credentials: LibSQLCredentials; + } + | { + dialect: 'singlestore'; + credentials: SingleStoreCredentials; + } ) & { out: string; breakpoints: boolean; @@ -469,7 +508,7 @@ export const preparePullConfig = async ( if (dialect === 'singlestore') { const parsed = singlestoreCredentials.safeParse(config); if (!parsed.success) { - printIssuesPg(config); + printIssuesSingleStore(config); process.exit(1); } @@ -477,11 +516,11 @@ export const preparePullConfig = async ( dialect: 'singlestore', out: config.out, breakpoints: config.breakpoints, - casing: config.introspectCasing, + casing: config.casing, credentials: parsed.data, tablesFilter, schemasFilter, - prefix: config.database?.prefix || 'index', + prefix: config.migrations?.prefix || 'index', }; } @@ -503,6 +542,24 @@ export const preparePullConfig = async ( }; } + if (dialect === 'turso') { + const parsed = libSQLCredentials.safeParse(config); + if (!parsed.success) { + printIssuesLibSql(config, 'pull'); + process.exit(1); + } + return { + dialect, + out: config.out, + breakpoints: config.breakpoints, + casing: config.casing, + credentials: parsed.data, + tablesFilter, + schemasFilter, + prefix: config.migrations?.prefix || 'index', + }; + } + assertUnreachable(dialect); }; @@ -560,7 +617,7 @@ export const prepareStudioConfig =
async (options: Record<string, unknown>) => { if (dialect === 'singlestore') { const parsed = singlestoreCredentials.safeParse(flattened); if (!parsed.success) { - printIssuesPg(flattened as Record<string, unknown>); + printIssuesSingleStore(flattened as Record<string, unknown>); process.exit(1); } const credentials = parsed.data; @@ -589,6 +646,22 @@ export const prepareStudioConfig = async (options: Record<string, unknown>) => { }; } + if (dialect === 'turso') { + const parsed = libSQLCredentials.safeParse(flattened); + if (!parsed.success) { + printIssuesLibSql(flattened as Record<string, unknown>, 'studio'); + process.exit(1); + } + const credentials = parsed.data; + return { + dialect, + schema, + host, + port, + credentials, + }; + } + assertUnreachable(dialect); }; @@ -674,6 +747,21 @@ export const prepareMigrateConfig = async (configPath: string | undefined) => { table, }; } + if (dialect === 'turso') { + const parsed = libSQLCredentials.safeParse(flattened); + if (!parsed.success) { + printIssuesLibSql(flattened as Record<string, unknown>, 'migrate'); + process.exit(1); + } + const credentials = parsed.data; + return { + dialect, + out, + credentials, + schema, + table, + }; + } assertUnreachable(dialect); }; diff --git a/drizzle-kit/src/cli/connections.ts b/drizzle-kit/src/cli/connections.ts index 3357bf146..fccc7def8 100644 --- a/drizzle-kit/src/cli/connections.ts +++ b/drizzle-kit/src/cli/connections.ts @@ -5,8 +5,17 @@ import fetch from 'node-fetch'; import ws from 'ws'; import { assertUnreachable } from '../global'; import type { ProxyParams } from '../serializer/studio'; -import { type DB, normalisePGliteUrl, normaliseSQLiteUrl, type Proxy, type SQLiteDB, type SqliteProxy } from '../utils'; +import { + type DB, + LibSQLDB, + normalisePGliteUrl, + normaliseSQLiteUrl, + type Proxy, + type SQLiteDB, + type SqliteProxy, +} from '../utils'; import { assertPackages, checkPackage } from './utils'; +import { LibSQLCredentials } from './validations/libsql'; import type { MysqlCredentials } from './validations/mysql'; import { withStyle } from
'./validations/outputs'; import type { PostgresCredentials } from './validations/postgres'; @@ -483,56 +492,7 @@ export const connectToSQLite = async ( credentials: SqliteCredentials, ): Promise< & SQLiteDB & SqliteProxy & { migrate: (config: MigrationConfig) => Promise<void> } > => { if ('driver' in credentials) { const { driver } = credentials; - if (driver === 'turso') { - assertPackages('@libsql/client'); - const { createClient } = await import('@libsql/client'); - const { drizzle } = await import('drizzle-orm/libsql'); - const { migrate } = await import('drizzle-orm/libsql/migrator'); - - const client = createClient({ - url: credentials.url, - authToken: credentials.authToken, - }); - - const drzl = drizzle(client); - const migrateFn = async (config: MigrationConfig) => { - return migrate(drzl, config); - }; - - const db: SQLiteDB = { - query: async <T>(sql: string, params?: any[]) => { - const res = await client.execute({ sql, args: params || [] }); - return res.rows as T[]; - }, - run: async (query: string) => { - await client.execute(query); - }, - batch: async ( - queries: { query: string; values?: any[] | undefined }[], - ) => { - await client.batch( - queries.map((it) => ({ sql: it.query, args: it.values ?? [] })), - ); - }, - }; - const proxy: SqliteProxy = { - proxy: async (params: ProxyParams) => { - const preparedParams = prepareSqliteParams(params.params); - const result = await client.execute({ - sql: params.sql, - args: preparedParams, - }); - - if (params.mode === 'array') { - return result.rows.map((row) => Object.values(row)); - } else { - return result.rows; - } - }, - }; - - return { ...db, ...proxy, migrate: migrateFn }; - } else if (driver === 'd1-http') { + if (driver === 'd1-http') { const { drizzle } = await import('drizzle-orm/sqlite-proxy'); const { migrate } = await import('drizzle-orm/sqlite-proxy/migrator'); @@ -709,12 +669,70 @@ export const connectToSQLite = async ( }; return { ...db, ...proxy, migrate: migrateFn }; } + console.log( "Please install either 'better-sqlite3' or '@libsql/client' for Drizzle Kit to connect to SQLite databases", ); process.exit(1); }; +export const connectToLibSQL = async (credentials: LibSQLCredentials): Promise< + & LibSQLDB + & SqliteProxy + & { migrate: (config: MigrationConfig) => Promise<void> } +> => { + if (await checkPackage('@libsql/client')) { + const { createClient } = await import('@libsql/client'); + const { drizzle } = await import('drizzle-orm/libsql'); + const { migrate } = await import('drizzle-orm/libsql/migrator'); + + const client = createClient({ + url: normaliseSQLiteUrl(credentials.url, 'libsql'), + authToken: credentials.authToken, + }); + const drzl = drizzle(client); + const migrateFn = async (config: MigrationConfig) => { + return migrate(drzl, config); + }; + + const db: LibSQLDB = { + query: async <T>(sql: string, params?: any[]) => { + const res = await client.execute({ sql, args: params || [] }); + return res.rows as T[]; + }, + run: async (query: string) => { + await client.execute(query); + }, + batchWithPragma: async (queries: string[]) => { + await client.migrate(queries); + }, + }; + + const proxy: SqliteProxy = { + proxy: async (params: ProxyParams) => { + const preparedParams =
prepareSqliteParams(params.params); + const result = await client.execute({ + sql: params.sql, + args: preparedParams, + }); + + if (params.mode === 'array') { + return result.rows.map((row) => Object.values(row)); + } else { + return result.rows; + } + }, + }; + + return { ...db, ...proxy, migrate: migrateFn }; + } + + console.log( + "Please install '@libsql/client' for Drizzle Kit to connect to LibSQL databases", + ); + process.exit(1); +}; + const parseSingleStoreCredentials = (credentials: SingleStoreCredentials) => { if ('url' in credentials) { const url = credentials.url; diff --git a/drizzle-kit/src/cli/schema.ts b/drizzle-kit/src/cli/schema.ts index 4da8af0ac..6b7bcb560 100644 --- a/drizzle-kit/src/cli/schema.ts +++ b/drizzle-kit/src/cli/schema.ts @@ -1,11 +1,20 @@ +import { boolean, command, number, string } from '@drizzle-team/brocli'; import chalk from 'chalk'; -import { checkHandler } from './commands/check'; -import { assertOrmCoreVersion, assertPackages, assertStudioNodeVersion, ormVersionGt } from './utils'; +import 'dotenv/config'; +import { mkdirSync } from 'fs'; +import { renderWithTask } from 'hanji'; +import { dialects } from 'src/schemaValidator'; import '../@types/utils'; +import { assertUnreachable } from '../global'; +import { drizzleForLibSQL, drizzleForSingleStore, prepareSingleStoreSchema, type Setup } from '../serializer/studio'; import { assertV1OutFolder } from '../utils'; +import { certs } from '../utils/certs'; +import { checkHandler } from './commands/check'; import { dropMigration } from './commands/drop'; +import { prepareAndMigrateSingleStore } from './commands/migrate'; import { upMysqlHandler } from './commands/mysqlUp'; import { upPgHandler } from './commands/pgUp'; +import { upSinglestoreHandler } from './commands/singlestoreUp'; import { upSqliteHandler } from './commands/sqliteUp'; import { prepareCheckParams, @@ -16,21 +25,14 @@ import { preparePushConfig, prepareStudioConfig, } from './commands/utils'; +import { 
assertOrmCoreVersion, assertPackages, assertStudioNodeVersion, ormVersionGt } from './utils'; import { assertCollisions, drivers, prefixes } from './validations/common'; import { withStyle } from './validations/outputs'; -import 'dotenv/config'; -import { boolean, command, number, string } from '@drizzle-team/brocli'; -import { mkdirSync } from 'fs'; -import { renderWithTask } from 'hanji'; -import { dialects } from 'src/schemaValidator'; -import { assertUnreachable } from '../global'; -import type { Setup } from '../serializer/studio'; -import { certs } from '../utils/certs'; import { grey, MigrateProgress } from './views'; const optionDialect = string('dialect') .enum(...dialects) - .desc(`Database dialect: 'postgresql', 'mysql' or 'sqlite'`); + .desc(`Database dialect: 'postgresql', 'mysql', 'sqlite' or 'turso'`); const optionOut = string().desc("Output folder, 'drizzle' by default"); const optionConfig = string().desc('Path to drizzle config file'); const optionBreakpoints = boolean().desc( @@ -77,6 +79,7 @@ export const generate = command({ prepareAndMigratePg, prepareAndMigrateMysql, prepareAndMigrateSqlite, + prepareAndMigrateLibSQL, } = await import('./commands/migrate'); const dialect = opts.dialect; @@ -86,6 +89,10 @@ export const generate = command({ await prepareAndMigrateMysql(opts); } else if (dialect === 'sqlite') { await prepareAndMigrateSqlite(opts); + } else if (dialect === 'turso') { + await prepareAndMigrateLibSQL(opts); + } else if (dialect === 'singlestore') { + await prepareAndMigrateSingleStore(opts); } else { assertUnreachable(dialect); } @@ -159,6 +166,28 @@ export const migrate = command({ migrationsSchema: schema, }), ); + } else if (dialect === 'turso') { + const { connectToLibSQL } = await import('./connections'); + const { migrate } = await connectToLibSQL(credentials); + await renderWithTask( + new MigrateProgress(), + migrate({ + migrationsFolder: opts.out, + migrationsTable: table, + migrationsSchema: schema, + }), + ); + } else if 
(dialect === 'singlestore') { + const { connectToSingleStore } = await import('./connections'); + const { migrate } = await connectToSingleStore(credentials); + await renderWithTask( + new MigrateProgress(), + migrate({ + migrationsFolder: out, + migrationsTable: table, + migrationsSchema: schema, + }), + ); } else { assertUnreachable(dialect); } @@ -304,6 +333,26 @@ export const push = command({ tablesFilter, force, ); + } else if (dialect === 'turso') { + const { libSQLPush } = await import('./commands/push'); + await libSQLPush( + schemaPath, + verbose, + strict, + credentials, + tablesFilter, + force, + ); + } else if (dialect === 'singlestore') { + const { singlestorePush } = await import('./commands/push'); + await singlestorePush( + schemaPath, + credentials, + tablesFilter, + strict, + verbose, + force, + ); } else { assertUnreachable(dialect); } @@ -359,7 +408,11 @@ export const up = command({ upMysqlHandler(out); } - if (dialect === 'sqlite') { + if (dialect === 'singlestore') { + upSinglestoreHandler(out); + } + + if (dialect === 'sqlite' || dialect === 'turso') { upSqliteHandler(out); } }, @@ -483,6 +536,26 @@ export const pull = command({ tablesFilter, prefix, ); + } else if (dialect === 'turso') { + const { introspectLibSQL } = await import('./commands/introspect'); + await introspectLibSQL( + casing, + out, + breakpoints, + credentials, + tablesFilter, + prefix, + ); + } else if (dialect === 'singlestore') { + const { introspectSingleStore } = await import('./commands/introspect'); + await introspectSingleStore( + casing, + out, + breakpoints, + credentials, + tablesFilter, + prefix, + ); } else { assertUnreachable(dialect); } @@ -583,6 +656,16 @@ export const studio = command({ ? await prepareSQLiteSchema(schemaPath) : { schema: {}, relations: {}, files: [] }; setup = await drizzleForSQLite(credentials, schema, relations, files); + } else if (dialect === 'turso') { + const { schema, relations, files } = schemaPath + ? 
await prepareSQLiteSchema(schemaPath) + : { schema: {}, relations: {}, files: [] }; + setup = await drizzleForLibSQL(credentials, schema, relations, files); + } else if (dialect === 'singlestore') { + const { schema, relations, files } = schemaPath + ? await prepareSingleStoreSchema(schemaPath) + : { schema: {}, relations: {}, files: [] }; + setup = await drizzleForSingleStore(credentials, schema, relations, files); } else { assertUnreachable(dialect); } diff --git a/drizzle-kit/src/cli/utils.ts b/drizzle-kit/src/cli/utils.ts index f7e7a2ae9..0a5d7862e 100644 --- a/drizzle-kit/src/cli/utils.ts +++ b/drizzle-kit/src/cli/utils.ts @@ -74,7 +74,7 @@ export const assertEitherPackage = async ( process.exit(1); }; -const requiredApiVersion = 7; +const requiredApiVersion = 8; export const assertOrmCoreVersion = async () => { try { const { compatibilityVersion } = await import('drizzle-orm/version'); diff --git a/drizzle-kit/src/cli/validations/common.ts b/drizzle-kit/src/cli/validations/common.ts index a7307f4d6..3a2701e37 100644 --- a/drizzle-kit/src/cli/validations/common.ts +++ b/drizzle-kit/src/cli/validations/common.ts @@ -61,7 +61,6 @@ export const assertCollisions = < }; export const sqliteDriversLiterals = [ - literal('turso'), literal('d1-http'), literal('expo'), ] as const; @@ -156,7 +155,7 @@ export const configPushSchema = object({ }); export type CliConfig = TypeOf; -export const drivers = ['turso', 'd1-http', 'expo', 'aws-data-api', 'pglite'] as const; +export const drivers = ['d1-http', 'expo', 'aws-data-api', 'pglite'] as const; export type Driver = (typeof drivers)[number]; const _: Driver = '' as TypeOf; diff --git a/drizzle-kit/src/cli/validations/libsql.ts b/drizzle-kit/src/cli/validations/libsql.ts new file mode 100644 index 000000000..a9b03c168 --- /dev/null +++ b/drizzle-kit/src/cli/validations/libsql.ts @@ -0,0 +1,27 @@ +import { softAssertUnreachable } from 'src/global'; +import { object, string, TypeOf } from 'zod'; +import { error } from 
'../views'; +import { wrapParam } from './common'; + +export const libSQLCredentials = object({ + url: string().min(1), + authToken: string().min(1).optional(), +}); + +export type LibSQLCredentials = { + url: string; + authToken?: string; +}; + +const _: LibSQLCredentials = {} as TypeOf<typeof libSQLCredentials>; + +export const printConfigConnectionIssues = ( + options: Record<string, unknown>, + command: 'generate' | 'migrate' | 'push' | 'pull' | 'studio', +) => { + let text = `Please provide required params for 'turso' dialect:\n`; + console.log(error(text)); + console.log(wrapParam('url', options.url)); + console.log(wrapParam('authToken', options.authToken, true, 'secret')); + process.exit(1); +}; diff --git a/drizzle-kit/src/cli/validations/sqlite.ts b/drizzle-kit/src/cli/validations/sqlite.ts index b6ad062d5..54178fd4a 100644 --- a/drizzle-kit/src/cli/validations/sqlite.ts +++ b/drizzle-kit/src/cli/validations/sqlite.ts @@ -25,11 +25,6 @@ export const sqliteCredentials = union([ ]); export type SqliteCredentials = - | { - driver: 'turso'; - url: string; - authToken: string; - } | { driver: 'd1-http'; accountId: string; diff --git a/drizzle-kit/src/index.ts b/drizzle-kit/src/index.ts index 5da4cb7b4..dc0c6274c 100644 --- a/drizzle-kit/src/index.ts +++ b/drizzle-kit/src/index.ts @@ -128,8 +128,7 @@ export type Config = } & ( | { - dialect: Verify<Dialect, 'sqlite'>; - driver: Verify<Driver, 'turso'>; + dialect: Verify<Dialect, 'turso'>; dbCredentials: { url: string; authToken?: string; diff --git a/drizzle-kit/src/jsonStatements.ts b/drizzle-kit/src/jsonStatements.ts index c099a9a69..b27785d9a 100644 --- a/drizzle-kit/src/jsonStatements.ts +++ b/drizzle-kit/src/jsonStatements.ts @@ -1,11 +1,16 @@ import chalk from 'chalk'; -import { table } from 'console'; +import { getNewTableName } from './cli/commands/sqlitePushUtils'; import { warning } from './cli/views'; -import { CommonSquashedSchema, Dialect } from './schemaValidator'; +import { CommonSquashedSchema } from './schemaValidator'; import { MySqlKitInternals, MySqlSchema, MySqlSquasher } from
'./serializer/mysqlSchema'; import { Index, PgSchema, PgSquasher } from './serializer/pgSchema'; import { SingleStoreKitInternals, SingleStoreSchema, SingleStoreSquasher } from './serializer/singlestoreSchema'; -import { SQLiteKitInternals, SQLiteSquasher } from './serializer/sqliteSchema'; +import { + SQLiteKitInternals, + SQLiteSchemaInternal, + SQLiteSchemaSquashed, + SQLiteSquasher, +} from './serializer/sqliteSchema'; import { AlteredColumn, Column, Sequence, Table } from './snapshotsDiffer'; export interface JsonSqliteCreateTableStatement { @@ -36,6 +41,23 @@ export interface JsonCreateTableStatement { internals?: MySqlKitInternals | SingleStoreKitInternals; } +export interface JsonRecreateTableStatement { + type: 'recreate_table'; + tableName: string; + columns: Column[]; + referenceData: { + name: string; + tableFrom: string; + columnsFrom: string[]; + tableTo: string; + columnsTo: string[]; + onUpdate?: string | undefined; + onDelete?: string | undefined; + }[]; + compositePKs: string[][]; + uniqueConstraints?: string[]; +} + export interface JsonDropTableStatement { type: 'drop_table'; tableName: string; @@ -174,6 +196,10 @@ export interface JsonReferenceStatement { data: string; schema: string; tableName: string; + isMulticolumn?: boolean; + columnNotNull?: boolean; + columnDefault?: string; + columnType?: string; // fromTable: string; // fromColumns: string[]; // toTable: string; @@ -520,6 +546,7 @@ export type JsonAlterColumnStatement = | JsonAlterColumnDropIdentityStatement; export type JsonStatement = + | JsonRecreateTableStatement | JsonAlterColumnStatement | JsonCreateTableStatement | JsonDropTableStatement @@ -2023,6 +2050,55 @@ export const prepareSqliteAlterColumns = ( `${tableName}_${columnName}` ]; + if (column.autoincrement?.type === 'added') { + statements.push({ + type: 'alter_table_alter_column_set_autoincrement', + tableName, + columnName, + schema, + newDataType: columnType, + columnDefault, + columnOnUpdate, + columnNotNull, + 
columnAutoIncrement, + columnPk, + }); + } + + if (column.autoincrement?.type === 'changed') { + const type = column.autoincrement.new + ? 'alter_table_alter_column_set_autoincrement' + : 'alter_table_alter_column_drop_autoincrement'; + + statements.push({ + type, + tableName, + columnName, + schema, + newDataType: columnType, + columnDefault, + columnOnUpdate, + columnNotNull, + columnAutoIncrement, + columnPk, + }); + } + + if (column.autoincrement?.type === 'deleted') { + statements.push({ + type: 'alter_table_alter_column_drop_autoincrement', + tableName, + columnName, + schema, + newDataType: columnType, + columnDefault, + columnOnUpdate, + columnNotNull, + columnAutoIncrement, + columnPk, + }); + } + if (typeof column.name !== 'string') { statements.push({ type: 'alter_table_rename_column', @@ -2330,6 +2406,54 @@ export const prepareCreateReferencesJson = ( }; }); }; +export const prepareLibSQLCreateReferencesJson = ( + tableName: string, + schema: string, + foreignKeys: Record, + json2: SQLiteSchemaSquashed, + action?: 'push', +): JsonCreateReferenceStatement[] => { + return Object.values(foreignKeys).map((fkData) => { + const { columnsFrom, tableFrom, columnsTo } = action === 'push' + ? SQLiteSquasher.unsquashPushFK(fkData) + : SQLiteSquasher.unsquashFK(fkData); + + // When trying to alter table in lib sql it is necessary to pass all config for column like "NOT NULL", "DEFAULT", etc. 
+ // If it is multicolumn reference it is not possible to pass this data for all columns + // Pass multicolumn flag for sql statements to not generate migration + let isMulticolumn = false; + + if (columnsFrom.length > 1 || columnsTo.length > 1) { + isMulticolumn = true; + + return { + type: 'create_reference', + tableName, + data: fkData, + schema, + isMulticolumn, + }; + } + + const columnFrom = columnsFrom[0]; + + const { + notNull: columnNotNull, + default: columnDefault, + type: columnType, + } = json2.tables[tableFrom].columns[columnFrom]; + + return { + type: 'create_reference', + tableName, + data: fkData, + schema, + columnNotNull, + columnDefault, + columnType, + }; + }); +}; export const prepareDropReferencesJson = ( tableName: string, @@ -2345,6 +2469,77 @@ export const prepareDropReferencesJson = ( }; }); }; +export const prepareLibSQLDropReferencesJson = ( + tableName: string, + schema: string, + foreignKeys: Record, + json2: SQLiteSchemaSquashed, + meta: SQLiteSchemaInternal['_meta'], + action?: 'push', +): JsonDeleteReferenceStatement[] => { + const statements = Object.values(foreignKeys).map((fkData) => { + const { columnsFrom, tableFrom, columnsTo, name, tableTo, onDelete, onUpdate } = action === 'push' + ? SQLiteSquasher.unsquashPushFK(fkData) + : SQLiteSquasher.unsquashFK(fkData); + + // If all columns from where were references were deleted -> skip this logic + // Drop columns will cover this scenario + const keys = Object.keys(json2.tables[tableName].columns); + const filtered = columnsFrom.filter((it) => keys.includes(it)); + const fullDrop = filtered.length === 0; + if (fullDrop) return; + + // When trying to alter table in lib sql it is necessary to pass all config for column like "NOT NULL", "DEFAULT", etc. 
+ // If it is multicolumn reference it is not possible to pass this data for all columns + // Pass multicolumn flag for sql statements to not generate migration + let isMulticolumn = false; + + if (columnsFrom.length > 1 || columnsTo.length > 1) { + isMulticolumn = true; + + return { + type: 'delete_reference', + tableName, + data: fkData, + schema, + isMulticolumn, + }; + } + + const columnFrom = columnsFrom[0]; + const newTableName = getNewTableName(tableFrom, meta); + + const { + notNull: columnNotNull, + default: columnDefault, + type: columnType, + } = json2.tables[newTableName].columns[columnFrom]; + + const fkToSquash = { + columnsFrom, + columnsTo, + name, + tableFrom: newTableName, + tableTo, + onDelete, + onUpdate, + }; + const foreignKey = action === 'push' + ? SQLiteSquasher.squashPushFK(fkToSquash) + : SQLiteSquasher.squashFK(fkToSquash); + return { + type: 'delete_reference', + tableName, + data: foreignKey, + schema, + columnNotNull, + columnDefault, + columnType, + }; + }); + + return statements.filter((it) => it) as JsonDeleteReferenceStatement[]; +}; // alter should create 2 statements. 
It's important to make only 1 sql per statement(for breakpoints) export const prepareAlterReferencesJson = ( diff --git a/drizzle-kit/src/schemaValidator.ts b/drizzle-kit/src/schemaValidator.ts index 712252f37..e91b5ab11 100644 --- a/drizzle-kit/src/schemaValidator.ts +++ b/drizzle-kit/src/schemaValidator.ts @@ -4,7 +4,7 @@ import { pgSchema, pgSchemaSquashed } from './serializer/pgSchema'; import { singlestoreSchema, singlestoreSchemaSquashed } from './serializer/singlestoreSchema'; import { sqliteSchema, SQLiteSchemaSquashed } from './serializer/sqliteSchema'; -export const dialects = ['postgresql', 'mysql', 'sqlite', 'singlestore'] as const; +export const dialects = ['postgresql', 'mysql', 'sqlite', 'turso', 'singlestore'] as const; export const dialect = enumType(dialects); export type Dialect = (typeof dialects)[number]; diff --git a/drizzle-kit/src/serializer/singlestoreSerializer.ts b/drizzle-kit/src/serializer/singlestoreSerializer.ts index f275273f4..fc91becf2 100644 --- a/drizzle-kit/src/serializer/singlestoreSerializer.ts +++ b/drizzle-kit/src/serializer/singlestoreSerializer.ts @@ -1,13 +1,12 @@ import chalk from 'chalk'; -import { getTableName, is } from 'drizzle-orm'; -import { SQL } from 'drizzle-orm'; +import { is, SQL } from 'drizzle-orm'; import { AnySingleStoreTable, + getTableConfig, type PrimaryKey as PrimaryKeyORM, SingleStoreDialect, uniqueKeyName, } from 'drizzle-orm/singlestore-core'; -import { getTableConfig } from 'drizzle-orm/singlestore-core'; import { RowDataPacket } from 'mysql2/promise'; import { withStyle } from '../cli/validations/outputs'; import { IntrospectStage, IntrospectStatus } from '../cli/views'; diff --git a/drizzle-kit/src/serializer/sqliteSerializer.ts b/drizzle-kit/src/serializer/sqliteSerializer.ts index ce544235b..41edd78a9 100644 --- a/drizzle-kit/src/serializer/sqliteSerializer.ts +++ b/drizzle-kit/src/serializer/sqliteSerializer.ts @@ -363,7 +363,6 @@ export const fromDatabase = async ( ) => void, ): Promise => { 
const result: Record = {}; - const columns = await db.query<{ tableName: string; columnName: string; diff --git a/drizzle-kit/src/serializer/studio.ts b/drizzle-kit/src/serializer/studio.ts index 5515e6f59..12ea8207c 100644 --- a/drizzle-kit/src/serializer/studio.ts +++ b/drizzle-kit/src/serializer/studio.ts @@ -25,6 +25,7 @@ import fs from 'fs'; import { Hono } from 'hono'; import { cors } from 'hono/cors'; import { createServer } from 'node:https'; +import { LibSQLCredentials } from 'src/cli/validations/libsql'; import { assertUnreachable } from 'src/global'; import superjson from 'superjson'; import { z } from 'zod'; @@ -342,8 +343,6 @@ export const drizzleForSQLite = async ( const { driver } = credentials; if (driver === 'd1-http') { dbUrl = `d1-http://${credentials.accountId}/${credentials.databaseId}/${credentials.token}`; - } else if (driver === 'turso') { - dbUrl = `turso://${credentials.url}/${credentials.authToken}`; } else { assertUnreachable(driver); } @@ -364,6 +363,32 @@ export const drizzleForSQLite = async ( schemaFiles, }; }; +export const drizzleForLibSQL = async ( + credentials: LibSQLCredentials, + sqliteSchema: Record>, + relations: Record, + schemaFiles?: SchemaFile[], +): Promise => { + const { connectToLibSQL } = await import('../cli/connections'); + + const sqliteDB = await connectToLibSQL(credentials); + const customDefaults = getCustomDefaults(sqliteSchema); + + let dbUrl: string = `turso://${credentials.url}/${credentials.authToken}`; + + const dbHash = createHash('sha256').update(dbUrl).digest('hex'); + + return { + dbHash, + dialect: 'sqlite', + driver: undefined, + proxy: sqliteDB.proxy, + customDefaults, + schema: sqliteSchema, + relations, + schemaFiles, + }; +}; export const drizzleForSingleStore = async ( credentials: SingleStoreCredentials, diff --git a/drizzle-kit/src/snapshotsDiffer.ts b/drizzle-kit/src/snapshotsDiffer.ts index 5b6c782c2..6f27a2505 100644 --- a/drizzle-kit/src/snapshotsDiffer.ts +++ 
b/drizzle-kit/src/snapshotsDiffer.ts @@ -5,7 +5,6 @@ import { enum as enumType, literal, never, - number, object, record, string, @@ -64,6 +63,8 @@ import { prepareDropReferencesJson, prepareDropSequenceJson, prepareDropTableJson, + prepareLibSQLCreateReferencesJson, + prepareLibSQLDropReferencesJson, prepareMoveEnumJson, prepareMoveSequenceJson, prepareMySqlCreateTableJson, @@ -83,9 +84,10 @@ import { import { Named, NamedWithSchema } from './cli/commands/migrate'; import { mapEntries, mapKeys, mapValues } from './global'; import { MySqlSchema, MySqlSchemaSquashed, MySqlSquasher } from './serializer/mysqlSchema'; -import { PgSchema, PgSchemaSquashed, PgSquasher, sequenceSchema, sequenceSquashed } from './serializer/pgSchema'; +import { PgSchema, PgSchemaSquashed, sequenceSquashed } from './serializer/pgSchema'; import { SingleStoreSchema, SingleStoreSchemaSquashed, SingleStoreSquasher } from './serializer/singlestoreSchema'; import { SQLiteSchema, SQLiteSchemaSquashed, SQLiteSquasher } from './serializer/sqliteSchema'; +import { libSQLCombineStatements, sqliteCombineStatements } from './statementCombiner'; import { copy, prepareMigrationMeta } from './utils'; const makeChanged = (schema: T) => { @@ -2469,7 +2471,8 @@ export const applySqliteSnapshotsDiff = async ( jsonStatements.push(...jsonAlteredUniqueConstraints); - const sqlStatements = fromJson(jsonStatements, 'sqlite'); + const combinedJsonStatements = sqliteCombineStatements(jsonStatements, json2, action); + const sqlStatements = fromJson(combinedJsonStatements, 'sqlite'); const uniqueSqlStatements: string[] = []; sqlStatements.forEach((ss) => { @@ -2485,7 +2488,428 @@ export const applySqliteSnapshotsDiff = async ( const _meta = prepareMigrationMeta([], rTables, rColumns); return { - statements: jsonStatements, + statements: combinedJsonStatements, + sqlStatements: uniqueSqlStatements, + _meta, + }; +}; + +export const applyLibSQLSnapshotsDiff = async ( + json1: SQLiteSchemaSquashed, + json2: 
SQLiteSchemaSquashed, + tablesResolver: ( + input: ResolverInput<Table>, + ) => Promise<ResolverOutputWithMoved<Table>>, + columnsResolver: ( + input: ColumnsResolverInput<Column>, + ) => Promise<ColumnsResolverOutput<Column>>, + prevFull: SQLiteSchema, + curFull: SQLiteSchema, + action?: 'push', +): Promise<{ + statements: JsonStatement[]; + sqlStatements: string[]; + _meta: + | { + schemas: {}; + tables: {}; + columns: {}; + } + | undefined; +}> => { + const tablesDiff = diffSchemasOrTables(json1.tables, json2.tables); + const { + created: createdTables, + deleted: deletedTables, + renamed: renamedTables, + } = await tablesResolver({ + created: tablesDiff.added, + deleted: tablesDiff.deleted, + }); + + const tablesPatchedSnap1 = copy(json1); + tablesPatchedSnap1.tables = mapEntries(tablesPatchedSnap1.tables, (_, it) => { + const { name } = nameChangeFor(it, renamedTables); + it.name = name; + return [name, it]; + }); + + const res = diffColumns(tablesPatchedSnap1.tables, json2.tables); + + const columnRenames = [] as { + table: string; + renames: { from: Column; to: Column }[]; + }[]; + + const columnCreates = [] as { + table: string; + columns: Column[]; + }[]; + + const columnDeletes = [] as { + table: string; + columns: Column[]; + }[]; + + for (let entry of Object.values(res)) { + const { renamed, created, deleted } = await columnsResolver({ + tableName: entry.name, + schema: entry.schema, + deleted: entry.columns.deleted, + created: entry.columns.added, + }); + + if (created.length > 0) { + columnCreates.push({ + table: entry.name, + columns: created, + }); + } + + if (deleted.length > 0) { + columnDeletes.push({ + table: entry.name, + columns: deleted, + }); + } + + if (renamed.length > 0) { + columnRenames.push({ + table: entry.name, + renames: renamed, + }); + } + } + + const columnRenamesDict = columnRenames.reduce( + (acc, it) => { + acc[it.table] = it.renames; + return acc; + }, + {} as Record< + string, + { + from: Named; + to: Named; + }[] + >, + ); + + const columnsPatchedSnap1 = copy(tablesPatchedSnap1); +
columnsPatchedSnap1.tables = mapEntries( + columnsPatchedSnap1.tables, + (tableKey, tableValue) => { + const patchedColumns = mapKeys( + tableValue.columns, + (columnKey, column) => { + const rens = columnRenamesDict[tableValue.name] || []; + const newName = columnChangeFor(columnKey, rens); + column.name = newName; + return newName; + }, + ); + + tableValue.columns = patchedColumns; + return [tableKey, tableValue]; + }, + ); + + const diffResult = applyJsonDiff(columnsPatchedSnap1, json2); + + const typedResult = diffResultSchemeSQLite.parse(diffResult); + + // Map array of objects to map + const tablesMap: { + [key: string]: (typeof typedResult.alteredTablesWithColumns)[number]; + } = {}; + + typedResult.alteredTablesWithColumns.forEach((obj) => { + tablesMap[obj.name] = obj; + }); + + const jsonCreateTables = createdTables.map((it) => { + return prepareSQLiteCreateTable(it, action); + }); + + const jsonCreateIndexesForCreatedTables = createdTables + .map((it) => { + return prepareCreateIndexesJson( + it.name, + it.schema, + it.indexes, + curFull.internal, + ); + }) + .flat(); + + const jsonDropTables = deletedTables.map((it) => { + return prepareDropTableJson(it); + }); + + const jsonRenameTables = renamedTables.map((it) => { + return prepareRenameTableJson(it.from, it.to); + }); + + const jsonRenameColumnsStatements: JsonRenameColumnStatement[] = columnRenames + .map((it) => prepareRenameColumns(it.table, '', it.renames)) + .flat(); + + const jsonDropColumnsStatemets: JsonDropColumnStatement[] = columnDeletes + .map((it) => _prepareDropColumns(it.table, '', it.columns)) + .flat(); + + const jsonAddColumnsStatemets: JsonSqliteAddColumnStatement[] = columnCreates + .map((it) => { + return _prepareSqliteAddColumns( + it.table, + it.columns, + tablesMap[it.table] && tablesMap[it.table].addedForeignKeys + ? 
Object.values(tablesMap[it.table].addedForeignKeys) + : [], + ); + }) + .flat(); + + const rColumns = jsonRenameColumnsStatements.map((it) => { + const tableName = it.tableName; + const schema = it.schema; + return { + from: { schema, table: tableName, column: it.oldColumnName }, + to: { schema, table: tableName, column: it.newColumnName }, + }; + }); + + const rTables = renamedTables.map((it) => { + return { from: it.from, to: it.to }; + }); + + const _meta = prepareMigrationMeta([], rTables, rColumns); + + const allAltered = typedResult.alteredTablesWithColumns; + + const jsonAddedCompositePKs: JsonCreateCompositePK[] = []; + const jsonDeletedCompositePKs: JsonDeleteCompositePK[] = []; + const jsonAlteredCompositePKs: JsonAlterCompositePK[] = []; + + const jsonAddedUniqueConstraints: JsonCreateUniqueConstraint[] = []; + const jsonDeletedUniqueConstraints: JsonDeleteUniqueConstraint[] = []; + const jsonAlteredUniqueConstraints: JsonAlterUniqueConstraint[] = []; + + allAltered.forEach((it) => { + // This part is needed to make sure that same columns in a table are not triggered for change + // there is a case where orm and kit are responsible for pk name generation and one of them is not sorting name + // We double-check that pk with same set of columns are both in added and deleted diffs + let addedColumns: string[] = []; + for (const addedPkName of Object.keys(it.addedCompositePKs)) { + const addedPkColumns = it.addedCompositePKs[addedPkName]; + addedColumns = SQLiteSquasher.unsquashPK(addedPkColumns); + } + + let deletedColumns: string[] = []; + for (const deletedPkName of Object.keys(it.deletedCompositePKs)) { + const deletedPkColumns = it.deletedCompositePKs[deletedPkName]; + deletedColumns = SQLiteSquasher.unsquashPK(deletedPkColumns); + } + + // Don't need to sort, but need to add tests for it + // addedColumns.sort(); + // deletedColumns.sort(); + + const doPerformDeleteAndCreate = JSON.stringify(addedColumns) !== JSON.stringify(deletedColumns); + + let 
addedCompositePKs: JsonCreateCompositePK[] = []; + let deletedCompositePKs: JsonDeleteCompositePK[] = []; + let alteredCompositePKs: JsonAlterCompositePK[] = []; + if (doPerformDeleteAndCreate) { + addedCompositePKs = prepareAddCompositePrimaryKeySqlite( + it.name, + it.addedCompositePKs, + ); + deletedCompositePKs = prepareDeleteCompositePrimaryKeySqlite( + it.name, + it.deletedCompositePKs, + ); + } + alteredCompositePKs = prepareAlterCompositePrimaryKeySqlite( + it.name, + it.alteredCompositePKs, + ); + + // add logic for unique constraints + let addedUniqueConstraints: JsonCreateUniqueConstraint[] = []; + let deletedUniqueConstraints: JsonDeleteUniqueConstraint[] = []; + let alteredUniqueConstraints: JsonAlterUniqueConstraint[] = []; + + addedUniqueConstraints = prepareAddUniqueConstraint( + it.name, + it.schema, + it.addedUniqueConstraints, + ); + + deletedUniqueConstraints = prepareDeleteUniqueConstraint( + it.name, + it.schema, + it.deletedUniqueConstraints, + ); + if (it.alteredUniqueConstraints) { + const added: Record = {}; + const deleted: Record = {}; + for (const k of Object.keys(it.alteredUniqueConstraints)) { + added[k] = it.alteredUniqueConstraints[k].__new; + deleted[k] = it.alteredUniqueConstraints[k].__old; + } + addedUniqueConstraints.push( + ...prepareAddUniqueConstraint(it.name, it.schema, added), + ); + deletedUniqueConstraints.push( + ...prepareDeleteUniqueConstraint(it.name, it.schema, deleted), + ); + } + + jsonAddedCompositePKs.push(...addedCompositePKs); + jsonDeletedCompositePKs.push(...deletedCompositePKs); + jsonAlteredCompositePKs.push(...alteredCompositePKs); + + jsonAddedUniqueConstraints.push(...addedUniqueConstraints); + jsonDeletedUniqueConstraints.push(...deletedUniqueConstraints); + jsonAlteredUniqueConstraints.push(...alteredUniqueConstraints); + }); + + const jsonTableAlternations = allAltered + .map((it) => { + return prepareSqliteAlterColumns(it.name, it.schema, it.altered, json2); + }) + .flat(); + + const 
jsonCreateIndexesForAllAlteredTables = allAltered + .map((it) => { + return prepareCreateIndexesJson( + it.name, + it.schema, + it.addedIndexes || {}, + curFull.internal, + ); + }) + .flat(); + + const jsonDropIndexesForAllAlteredTables = allAltered + .map((it) => { + return prepareDropIndexesJson( + it.name, + it.schema, + it.deletedIndexes || {}, + ); + }) + .flat(); + + allAltered.forEach((it) => { + const droppedIndexes = Object.keys(it.alteredIndexes).reduce( + (current, item: string) => { + current[item] = it.alteredIndexes[item].__old; + return current; + }, + {} as Record, + ); + const createdIndexes = Object.keys(it.alteredIndexes).reduce( + (current, item: string) => { + current[item] = it.alteredIndexes[item].__new; + return current; + }, + {} as Record, + ); + + jsonCreateIndexesForAllAlteredTables.push( + ...prepareCreateIndexesJson( + it.name, + it.schema, + createdIndexes || {}, + curFull.internal, + ), + ); + jsonDropIndexesForAllAlteredTables.push( + ...prepareDropIndexesJson(it.name, it.schema, droppedIndexes || {}), + ); + }); + + const jsonReferencesForAllAlteredTables: JsonReferenceStatement[] = allAltered + .map((it) => { + const forAdded = prepareLibSQLCreateReferencesJson( + it.name, + it.schema, + it.addedForeignKeys, + json2, + action, + ); + + const forAltered = prepareLibSQLDropReferencesJson( + it.name, + it.schema, + it.deletedForeignKeys, + json2, + _meta, + action, + ); + + const alteredFKs = prepareAlterReferencesJson(it.name, it.schema, it.alteredForeignKeys); + + return [...forAdded, ...forAltered, ...alteredFKs]; + }) + .flat(); + + const jsonCreatedReferencesForAlteredTables = jsonReferencesForAllAlteredTables.filter( + (t) => t.type === 'create_reference', + ); + const jsonDroppedReferencesForAlteredTables = jsonReferencesForAllAlteredTables.filter( + (t) => t.type === 'delete_reference', + ); + + const jsonStatements: JsonStatement[] = []; + jsonStatements.push(...jsonCreateTables); + + jsonStatements.push(...jsonDropTables); 
+ jsonStatements.push(...jsonRenameTables); + jsonStatements.push(...jsonRenameColumnsStatements); + + jsonStatements.push(...jsonDroppedReferencesForAlteredTables); + + // Will need to drop indexes before changing any columns in table + // Then should go column alternations and then index creation + jsonStatements.push(...jsonDropIndexesForAllAlteredTables); + + jsonStatements.push(...jsonDeletedCompositePKs); + jsonStatements.push(...jsonTableAlternations); + jsonStatements.push(...jsonAddedCompositePKs); + jsonStatements.push(...jsonAddColumnsStatemets); + + jsonStatements.push(...jsonCreateIndexesForCreatedTables); + jsonStatements.push(...jsonCreateIndexesForAllAlteredTables); + + jsonStatements.push(...jsonCreatedReferencesForAlteredTables); + + jsonStatements.push(...jsonDropColumnsStatemets); + + jsonStatements.push(...jsonAlteredCompositePKs); + + jsonStatements.push(...jsonAlteredUniqueConstraints); + + const combinedJsonStatements = libSQLCombineStatements(jsonStatements, json2, action); + + const sqlStatements = fromJson( + combinedJsonStatements, + 'turso', + action, + json2, + ); + + const uniqueSqlStatements: string[] = []; + sqlStatements.forEach((ss) => { + if (!uniqueSqlStatements.includes(ss)) { + uniqueSqlStatements.push(ss); + } + }); + + return { + statements: combinedJsonStatements, sqlStatements: uniqueSqlStatements, _meta, }; diff --git a/drizzle-kit/src/sqlgenerator.ts b/drizzle-kit/src/sqlgenerator.ts index 07b24b6c9..f1e783a50 100644 --- a/drizzle-kit/src/sqlgenerator.ts +++ b/drizzle-kit/src/sqlgenerator.ts @@ -42,6 +42,7 @@ import { JsonDropTableStatement, JsonMoveSequenceStatement, JsonPgCreateIndexStatement, + JsonRecreateTableStatement, JsonRenameColumnStatement, JsonRenameSchema, JsonRenameSequenceStatement, @@ -54,7 +55,7 @@ import { Dialect } from './schemaValidator'; import { MySqlSquasher } from './serializer/mysqlSchema'; import { PgSquasher } from './serializer/pgSchema'; import { SingleStoreSquasher } from 
'./serializer/singlestoreSchema'; -import { SQLiteSquasher } from './serializer/sqliteSchema'; +import { SQLiteSchemaSquashed, SQLiteSquasher } from './serializer/sqliteSchema'; export const pgNativeTypes = new Set([ 'uuid', @@ -127,8 +128,15 @@ const isPgNativeType = (it: string) => { }; abstract class Convertor { - abstract can(statement: JsonStatement, dialect: Dialect): boolean; - abstract convert(statement: JsonStatement): string | string[]; + abstract can( + statement: JsonStatement, + dialect: Dialect, + ): boolean; + abstract convert( + statement: JsonStatement, + json2?: SQLiteSchemaSquashed, + action?: 'push', + ): string | string[]; } class PgCreateTableConvertor extends Convertor { @@ -382,7 +390,7 @@ class SingleStoreCreateTableConvertor extends Convertor { export class SQLiteCreateTableConvertor extends Convertor { can(statement: JsonStatement, dialect: Dialect): boolean { - return statement.type === 'sqlite_create_table' && dialect === 'sqlite'; + return statement.type === 'sqlite_create_table' && (dialect === 'sqlite' || dialect === 'turso'); } convert(st: JsonSqliteCreateTableStatement) { @@ -888,7 +896,7 @@ class SingleStoreDropTableConvertor extends Convertor { export class SQLiteDropTableConvertor extends Convertor { can(statement: JsonStatement, dialect: Dialect): boolean { - return statement.type === 'drop_table' && dialect === 'sqlite'; + return statement.type === 'drop_table' && (dialect === 'sqlite' || dialect === 'turso'); } convert(statement: JsonDropTableStatement) { @@ -914,7 +922,7 @@ class PgRenameTableConvertor extends Convertor { export class SqliteRenameTableConvertor extends Convertor { can(statement: JsonStatement, dialect: Dialect): boolean { - return statement.type === 'rename_table' && dialect === 'sqlite'; + return statement.type === 'rename_table' && (dialect === 'sqlite' || dialect === 'turso'); } convert(statement: JsonRenameTableStatement) { @@ -992,13 +1000,13 @@ class SingleStoreAlterTableRenameColumnConvertor extends 
Convertor { class SQLiteAlterTableRenameColumnConvertor extends Convertor { can(statement: JsonStatement, dialect: Dialect): boolean { return ( - statement.type === 'alter_table_rename_column' && dialect === 'sqlite' + statement.type === 'alter_table_rename_column' && (dialect === 'sqlite' || dialect === 'turso') ); } convert(statement: JsonRenameColumnStatement) { const { tableName, oldColumnName, newColumnName } = statement; - return `ALTER TABLE \`${tableName}\` RENAME COLUMN \`${oldColumnName}\` TO \`${newColumnName}\`;`; + return `ALTER TABLE \`${tableName}\` RENAME COLUMN "${oldColumnName}" TO "${newColumnName}";`; } } @@ -1044,7 +1052,7 @@ class SingleStoreAlterTableDropColumnConvertor extends Convertor { class SQLiteAlterTableDropColumnConvertor extends Convertor { can(statement: JsonStatement, dialect: Dialect): boolean { - return statement.type === 'alter_table_drop_column' && dialect === 'sqlite'; + return statement.type === 'alter_table_drop_column' && (dialect === 'sqlite' || dialect === 'turso'); } convert(statement: JsonDropColumnStatement) { @@ -1185,7 +1193,7 @@ class SingleStoreAlterTableAddColumnConvertor extends Convertor { export class SQLiteAlterTableAddColumnConvertor extends Convertor { can(statement: JsonStatement, dialect: Dialect): boolean { return ( - statement.type === 'sqlite_alter_table_add_column' && dialect === 'sqlite' + statement.type === 'sqlite_alter_table_add_column' && (dialect === 'sqlite' || dialect === 'turso') ); } @@ -1232,26 +1240,6 @@ class PgAlterTableAlterColumnSetTypeConvertor extends Convertor { } } -class SQLiteAlterTableAlterColumnSetTypeConvertor extends Convertor { - can(statement: JsonStatement, dialect: Dialect): boolean { - return ( - statement.type === 'alter_table_alter_column_set_type' - && dialect === 'sqlite' - ); - } - - convert(statement: JsonAlterColumnTypeStatement) { - return ( - '/*\n SQLite does not support "Changing existing column type" out of the box, we do not generate automatic migration for 
that, so it has to be done manually' - + '\n Please refer to: https://www.techonthenet.com/sqlite/tables/alter_table.php' - + '\n https://www.sqlite.org/lang_altertable.html' - + '\n https://stackoverflow.com/questions/2083543/modify-a-columns-type-in-sqlite3' - + "\n\n Due to that we don't generate migration automatically and it has to be done manually" - + '\n*/' - ); - } -} - class PgAlterTableAlterColumnSetDefaultConvertor extends Convertor { can(statement: JsonStatement, dialect: Dialect): boolean { return ( @@ -1271,26 +1259,6 @@ class PgAlterTableAlterColumnSetDefaultConvertor extends Convertor { } } -class SqliteAlterTableAlterColumnSetDefaultConvertor extends Convertor { - can(statement: JsonStatement, dialect: Dialect): boolean { - return ( - statement.type === 'alter_table_alter_column_set_default' - && dialect === 'sqlite' - ); - } - - convert(statement: JsonAlterColumnSetDefaultStatement) { - return ( - '/*\n SQLite does not support "Set default to column" out of the box, we do not generate automatic migration for that, so it has to be done manually' - + '\n Please refer to: https://www.techonthenet.com/sqlite/tables/alter_table.php' - + '\n https://www.sqlite.org/lang_altertable.html' - + '\n https://stackoverflow.com/questions/2083543/modify-a-columns-type-in-sqlite3' - + "\n\n Due to that we don't generate migration automatically and it has to be done manually" - + '\n*/' - ); - } -} - class PgAlterTableAlterColumnDropDefaultConvertor extends Convertor { can(statement: JsonStatement, dialect: Dialect): boolean { return ( @@ -1430,7 +1398,7 @@ class SqliteAlterTableAlterColumnDropGeneratedConvertor extends Convertor { can(statement: JsonStatement, dialect: Dialect): boolean { return ( statement.type === 'alter_table_alter_column_drop_generated' - && dialect === 'sqlite' + && (dialect === 'sqlite' || dialect === 'turso') ); } @@ -1479,7 +1447,7 @@ class SqliteAlterTableAlterColumnSetExpressionConvertor extends Convertor { can(statement: JsonStatement, 
dialect: Dialect): boolean { return ( statement.type === 'alter_table_alter_column_set_generated' - && dialect === 'sqlite' + && (dialect === 'sqlite' || dialect === 'turso') ); } @@ -1528,7 +1496,7 @@ class SqliteAlterTableAlterColumnAlterGeneratedConvertor extends Convertor { can(statement: JsonStatement, dialect: Dialect): boolean { return ( statement.type === 'alter_table_alter_column_alter_generated' - && dialect === 'sqlite' + && (dialect === 'sqlite' || dialect === 'turso') ); } @@ -1675,6 +1643,119 @@ class MySqlAlterTableDropPk extends Convertor { } } +type LibSQLModifyColumnStatement = + | JsonAlterColumnTypeStatement + | JsonAlterColumnDropNotNullStatement + | JsonAlterColumnSetNotNullStatement + | JsonAlterColumnSetDefaultStatement + | JsonAlterColumnDropDefaultStatement; + +export class LibSQLModifyColumn extends Convertor { + can(statement: JsonStatement, dialect: Dialect): boolean { + return ( + (statement.type === 'alter_table_alter_column_set_type' + || statement.type === 'alter_table_alter_column_drop_notnull' + || statement.type === 'alter_table_alter_column_set_notnull' + || statement.type === 'alter_table_alter_column_set_default' + || statement.type === 'alter_table_alter_column_drop_default') + && dialect === 'turso' + ); + } + + convert(statement: LibSQLModifyColumnStatement, json2: SQLiteSchemaSquashed) { + const { tableName, columnName } = statement; + + let columnType = ``; + let columnDefault: any = ''; + let columnNotNull = ''; + + const sqlStatements: string[] = []; + + // collect index info + const indexes: { + name: string; + tableName: string; + columns: string[]; + isUnique: boolean; + where?: string | undefined; + }[] = []; + for (const table of Object.values(json2.tables)) { + for (const index of Object.values(table.indexes)) { + const unsquashed = SQLiteSquasher.unsquashIdx(index); + sqlStatements.push(`DROP INDEX IF EXISTS "${unsquashed.name}";`); + indexes.push({ ...unsquashed, tableName: table.name }); + } + } + + switch 
(statement.type) { + case 'alter_table_alter_column_set_type': + columnType = ` ${statement.newDataType}`; + + columnDefault = statement.columnDefault + ? ` DEFAULT ${statement.columnDefault}` + : ''; + + columnNotNull = statement.columnNotNull ? ` NOT NULL` : ''; + + break; + case 'alter_table_alter_column_drop_notnull': + columnType = ` ${statement.newDataType}`; + + columnDefault = statement.columnDefault + ? ` DEFAULT ${statement.columnDefault}` + : ''; + + columnNotNull = ''; + break; + case 'alter_table_alter_column_set_notnull': + columnType = ` ${statement.newDataType}`; + + columnDefault = statement.columnDefault + ? ` DEFAULT ${statement.columnDefault}` + : ''; + + columnNotNull = ` NOT NULL`; + break; + case 'alter_table_alter_column_set_default': + columnType = ` ${statement.newDataType}`; + + columnDefault = ` DEFAULT ${statement.newDefaultValue}`; + + columnNotNull = statement.columnNotNull ? ` NOT NULL` : ''; + break; + case 'alter_table_alter_column_drop_default': + columnType = ` ${statement.newDataType}`; + + columnDefault = ''; + + columnNotNull = statement.columnNotNull ? ` NOT NULL` : ''; + break; + } + + // Default values read from the plain json2 snapshot may come back as Date objects, so normalize them to ISO strings + columnDefault = columnDefault instanceof Date + ? columnDefault.toISOString() + : columnDefault; + + sqlStatements.push( + `ALTER TABLE \`${tableName}\` ALTER COLUMN "${columnName}" TO "${columnName}"${columnType}${columnNotNull}${columnDefault};`, + ); + + for (const index of indexes) { + const indexPart = index.isUnique ? 'UNIQUE INDEX' : 'INDEX'; + const whereStatement = index.where ?
` WHERE ${index.where}` : ''; + const uniqueString = index.columns.map((it) => `\`${it}\``).join(','); + const tableName = index.tableName; + + sqlStatements.push( + `CREATE ${indexPart} \`${index.name}\` ON \`${tableName}\` (${uniqueString})${whereStatement};`, + ); + } + + return sqlStatements; + } +} + type MySqlModifyColumnStatement = | JsonAlterColumnDropNotNullStatement | JsonAlterColumnSetNotNullStatement @@ -2281,7 +2362,6 @@ class PgAlterTableCreateCompositePrimaryKeyConvertor extends Convertor { }");`; } } - class PgAlterTableDeleteCompositePrimaryKeyConvertor extends Convertor { can(statement: JsonStatement, dialect: Dialect): boolean { return statement.type === 'delete_composite_pk' && dialect === 'postgresql'; @@ -2541,66 +2621,6 @@ class PgAlterTableAlterColumnSetNotNullConvertor extends Convertor { } } -class SqliteAlterTableAlterColumnSetNotNullConvertor extends Convertor { - can(statement: JsonStatement, dialect: Dialect): boolean { - return ( - statement.type === 'alter_table_alter_column_set_notnull' - && dialect === 'sqlite' - ); - } - - convert(statement: JsonAlterColumnSetNotNullStatement) { - return ( - '/*\n SQLite does not support "Set not null to column" out of the box, we do not generate automatic migration for that, so it has to be done manually' - + '\n Please refer to: https://www.techonthenet.com/sqlite/tables/alter_table.php' - + '\n https://www.sqlite.org/lang_altertable.html' - + '\n https://stackoverflow.com/questions/2083543/modify-a-columns-type-in-sqlite3' - + "\n\n Due to that we don't generate migration automatically and it has to be done manually" - + '\n*/' - ); - } -} - -class SqliteAlterTableAlterColumnSetAutoincrementConvertor extends Convertor { - can(statement: JsonStatement, dialect: Dialect): boolean { - return ( - statement.type === 'alter_table_alter_column_set_autoincrement' - && dialect === 'sqlite' - ); - } - - convert(statement: JsonAlterColumnSetAutoincrementStatement) { - return ( - '/*\n SQLite does not 
support "Set autoincrement to a column" out of the box, we do not generate automatic migration for that, so it has to be done manually' - + '\n Please refer to: https://www.techonthenet.com/sqlite/tables/alter_table.php' - + '\n https://www.sqlite.org/lang_altertable.html' - + '\n https://stackoverflow.com/questions/2083543/modify-a-columns-type-in-sqlite3' - + "\n\n Due to that we don't generate migration automatically and it has to be done manually" - + '\n*/' - ); - } -} - -class SqliteAlterTableAlterColumnDropAutoincrementConvertor extends Convertor { - can(statement: JsonStatement, dialect: Dialect): boolean { - return ( - statement.type === 'alter_table_alter_column_drop_autoincrement' - && dialect === 'sqlite' - ); - } - - convert(statement: JsonAlterColumnDropAutoincrementStatement) { - return ( - '/*\n SQLite does not support "Drop autoincrement from a column" out of the box, we do not generate automatic migration for that, so it has to be done manually' - + '\n Please refer to: https://www.techonthenet.com/sqlite/tables/alter_table.php' - + '\n https://www.sqlite.org/lang_altertable.html' - + '\n https://stackoverflow.com/questions/2083543/modify-a-columns-type-in-sqlite3' - + "\n\n Due to that we don't generate migration automatically and it has to be done manually" - + '\n*/' - ); - } -} - class PgAlterTableAlterColumnDropNotNullConvertor extends Convertor { can(statement: JsonStatement, dialect: Dialect): boolean { return ( @@ -2620,26 +2640,6 @@ class PgAlterTableAlterColumnDropNotNullConvertor extends Convertor { } } -class SqliteAlterTableAlterColumnDropNotNullConvertor extends Convertor { - can(statement: JsonStatement, dialect: Dialect): boolean { - return ( - statement.type === 'alter_table_alter_column_drop_notnull' - && dialect === 'sqlite' - ); - } - - convert(statement: JsonAlterColumnDropNotNullStatement) { - return ( - '/*\n SQLite does not support "Drop not null from column" out of the box, we do not generate automatic migration for that, 
so it has to be done manually' - + '\n Please refer to: https://www.techonthenet.com/sqlite/tables/alter_table.php' - + '\n https://www.sqlite.org/lang_altertable.html' - + '\n https://stackoverflow.com/questions/2083543/modify-a-columns-type-in-sqlite3' - + "\n\n Due to that we don't generate migration automatically and it has to be done manually" - + '\n*/' - ); - } -} - // FK class PgCreateForeignKeyConvertor extends Convertor { can(statement: JsonStatement, dialect: Dialect): boolean { @@ -2682,20 +2682,37 @@ class PgCreateForeignKeyConvertor extends Convertor { } } -class SqliteCreateForeignKeyConvertor extends Convertor { +class LibSQLCreateForeignKeyConvertor extends Convertor { can(statement: JsonStatement, dialect: Dialect): boolean { - return statement.type === 'create_reference' && dialect === 'sqlite'; - } - - convert(statement: JsonCreateReferenceStatement): string { return ( - '/*\n SQLite does not support "Creating foreign key on existing column" out of the box, we do not generate automatic migration for that, so it has to be done manually' - + '\n Please refer to: https://www.techonthenet.com/sqlite/tables/alter_table.php' - + '\n https://www.sqlite.org/lang_altertable.html' - + "\n\n Due to that we don't generate migration automatically and it has to be done manually" - + '\n*/' + statement.type === 'create_reference' + && dialect === 'turso' ); } + + convert( + statement: JsonCreateReferenceStatement, + json2?: SQLiteSchemaSquashed, + action?: 'push', + ): string { + const { columnsFrom, columnsTo, tableFrom, onDelete, onUpdate, tableTo } = action === 'push' + ? SQLiteSquasher.unsquashPushFK(statement.data) + : SQLiteSquasher.unsquashFK(statement.data); + const { columnDefault, columnNotNull, columnType } = statement; + + const onDeleteStatement = onDelete ? ` ON DELETE ${onDelete}` : ''; + const onUpdateStatement = onUpdate ? ` ON UPDATE ${onUpdate}` : ''; + const columnsDefaultValue = columnDefault + ? 
` DEFAULT ${columnDefault}` + : ''; + const columnNotNullValue = columnNotNull ? ` NOT NULL` : ''; + const columnTypeValue = columnType ? ` ${columnType}` : ''; + + const columnFrom = columnsFrom[0]; + const columnTo = columnsTo[0]; + + return `ALTER TABLE \`${tableFrom}\` ALTER COLUMN "${columnFrom}" TO "${columnFrom}"${columnTypeValue}${columnNotNullValue}${columnsDefaultValue} REFERENCES ${tableTo}(${columnTo})${onDeleteStatement}${onUpdateStatement};`; + } } class MySqlCreateForeignKeyConvertor extends Convertor { @@ -2769,22 +2786,6 @@ class PgAlterForeignKeyConvertor extends Convertor { } } -class SqliteAlterForeignKeyConvertor extends Convertor { - can(statement: JsonStatement, dialect: Dialect): boolean { - return statement.type === 'alter_reference' && dialect === 'sqlite'; - } - - convert(statement: JsonAlterReferenceStatement): string { - return ( - '/*\n SQLite does not support "Changing existing foreign key" out of the box, we do not generate automatic migration for that, so it has to be done manually' - + '\n Please refer to: https://www.techonthenet.com/sqlite/tables/alter_table.php' - + '\n https://www.sqlite.org/lang_altertable.html' - + "\n\n Due to that we don't generate migration automatically and it has to be done manually" - + '\n*/' - ); - } -} - class PgDeleteForeignKeyConvertor extends Convertor { can(statement: JsonStatement, dialect: Dialect): boolean { return statement.type === 'delete_reference' && dialect === 'postgresql'; @@ -2802,22 +2803,6 @@ class PgDeleteForeignKeyConvertor extends Convertor { } } -class SqliteDeleteForeignKeyConvertor extends Convertor { - can(statement: JsonStatement, dialect: Dialect): boolean { - return statement.type === 'delete_reference' && dialect === 'sqlite'; - } - - convert(statement: JsonDeleteReferenceStatement): string { - return ( - '/*\n SQLite does not support "Dropping foreign key" out of the box, we do not generate automatic migration for that, so it has to be done manually' - + '\n Please refer 
to: https://www.techonthenet.com/sqlite/tables/alter_table.php' - + '\n https://www.sqlite.org/lang_altertable.html' - + "\n\n Due to that we don't generate migration automatically and it has to be done manually" - + '\n*/' - ); - } -} - class MySqlDeleteForeignKeyConvertor extends Convertor { can(statement: JsonStatement, dialect: Dialect): boolean { return statement.type === 'delete_reference' && dialect === 'mysql'; @@ -2939,7 +2924,7 @@ class CreateSingleStoreIndexConvertor extends Convertor { export class CreateSqliteIndexConvertor extends Convertor { can(statement: JsonStatement, dialect: Dialect): boolean { - return statement.type === 'create_index' && dialect === 'sqlite'; + return statement.type === 'create_index' && (dialect === 'sqlite' || dialect === 'turso'); } convert(statement: JsonCreateIndexStatement): string { @@ -3061,7 +3046,7 @@ class PgAlterTableRemoveFromSchemaConvertor extends Convertor { export class SqliteDropIndexConvertor extends Convertor { can(statement: JsonStatement, dialect: Dialect): boolean { - return statement.type === 'drop_index' && dialect === 'sqlite'; + return statement.type === 'drop_index' && (dialect === 'sqlite' || dialect === 'turso'); } convert(statement: JsonDropIndexStatement): string { @@ -3092,11 +3077,132 @@ class SingleStoreDropIndexConvertor extends Convertor { } } +class SQLiteRecreateTableConvertor extends Convertor { + can(statement: JsonStatement, dialect: Dialect): boolean { + return ( + statement.type === 'recreate_table' && dialect === 'sqlite' + ); + } + + convert(statement: JsonRecreateTableStatement): string | string[] { + const { tableName, columns, compositePKs, referenceData } = statement; + + const columnNames = columns.map((it) => `"${it.name}"`).join(', '); + const newTableName = `__new_${tableName}`; + + const sqlStatements: string[] = []; + + sqlStatements.push(`PRAGMA foreign_keys=OFF;`); + + // create new table + sqlStatements.push( + new SQLiteCreateTableConvertor().convert({ + type: 
'sqlite_create_table', + tableName: newTableName, + columns, + referenceData, + compositePKs, + }), + ); + + // migrate data + sqlStatements.push( + `INSERT INTO \`${newTableName}\`(${columnNames}) SELECT ${columnNames} FROM \`${tableName}\`;`, + ); + + // drop table + sqlStatements.push( + new SQLiteDropTableConvertor().convert({ + type: 'drop_table', + tableName: tableName, + schema: '', + }), + ); + + // rename table + sqlStatements.push( + new SqliteRenameTableConvertor().convert({ + fromSchema: '', + tableNameFrom: newTableName, + tableNameTo: tableName, + toSchema: '', + type: 'rename_table', + }), + ); + + sqlStatements.push(`PRAGMA foreign_keys=ON;`); + + return sqlStatements; + } +} + +class LibSQLRecreateTableConvertor extends Convertor { + can(statement: JsonStatement, dialect: Dialect): boolean { + return ( + statement.type === 'recreate_table' + && dialect === 'turso' + ); + } + + convert(statement: JsonRecreateTableStatement): string[] { + const { tableName, columns, compositePKs, referenceData } = statement; + + const columnNames = columns.map((it) => `"${it.name}"`).join(', '); + const newTableName = `__new_${tableName}`; + + const sqlStatements: string[] = []; + + sqlStatements.push(`PRAGMA foreign_keys=OFF;`); + + // create new table + sqlStatements.push( + new SQLiteCreateTableConvertor().convert({ + type: 'sqlite_create_table', + tableName: newTableName, + columns, + referenceData, + compositePKs, + }), + ); + + // migrate data + sqlStatements.push( + `INSERT INTO \`${newTableName}\`(${columnNames}) SELECT ${columnNames} FROM \`${tableName}\`;`, + ); + + // drop table + sqlStatements.push( + new SQLiteDropTableConvertor().convert({ + type: 'drop_table', + tableName: tableName, + schema: '', + }), + ); + + // rename table + sqlStatements.push( + new SqliteRenameTableConvertor().convert({ + fromSchema: '', + tableNameFrom: newTableName, + tableNameTo: tableName, + toSchema: '', + type: 'rename_table', + }), + ); + + sqlStatements.push(`PRAGMA 
foreign_keys=ON;`); + + return sqlStatements; + } +} + const convertors: Convertor[] = []; convertors.push(new PgCreateTableConvertor()); convertors.push(new MySqlCreateTableConvertor()); convertors.push(new SingleStoreCreateTableConvertor()); convertors.push(new SQLiteCreateTableConvertor()); +convertors.push(new SQLiteRecreateTableConvertor()); +convertors.push(new LibSQLRecreateTableConvertor()); convertors.push(new CreateTypeEnumConvertor()); @@ -3175,6 +3281,7 @@ convertors.push(new SqliteAlterTableAlterColumnAlterGeneratedConvertor()); convertors.push(new SqliteAlterTableAlterColumnSetExpressionConvertor()); convertors.push(new MySqlModifyColumn()); +convertors.push(new LibSQLModifyColumn()); // convertors.push(new MySqlAlterTableAlterColumnSetDefaultConvertor()); // convertors.push(new MySqlAlterTableAlterColumnDropDefaultConvertor()); @@ -3195,31 +3302,12 @@ convertors.push(new PgAlterTableSetSchemaConvertor()); convertors.push(new PgAlterTableSetNewSchemaConvertor()); convertors.push(new PgAlterTableRemoveFromSchemaConvertor()); -// Unhandled sqlite queries, so they will appear last -convertors.push(new SQLiteAlterTableAlterColumnSetTypeConvertor()); -convertors.push(new SqliteAlterForeignKeyConvertor()); -convertors.push(new SqliteDeleteForeignKeyConvertor()); -convertors.push(new SqliteCreateForeignKeyConvertor()); - -convertors.push(new SQLiteAlterTableAddUniqueConstraintConvertor()); -convertors.push(new SQLiteAlterTableDropUniqueConstraintConvertor()); +convertors.push(new LibSQLCreateForeignKeyConvertor()); convertors.push(new PgAlterTableAlterColumnDropGenerated()); convertors.push(new PgAlterTableAlterColumnSetGenerated()); convertors.push(new PgAlterTableAlterColumnAlterGenerated()); -convertors.push(new SqliteAlterTableAlterColumnSetNotNullConvertor()); -convertors.push(new SqliteAlterTableAlterColumnDropNotNullConvertor()); -convertors.push(new SqliteAlterTableAlterColumnSetDefaultConvertor()); -convertors.push(new 
SqliteAlterTableAlterColumnDropDefaultConvertor()); - -convertors.push(new SqliteAlterTableAlterColumnSetAutoincrementConvertor()); -convertors.push(new SqliteAlterTableAlterColumnDropAutoincrementConvertor()); - -convertors.push(new SqliteAlterTableCreateCompositePrimaryKeyConvertor()); -convertors.push(new SqliteAlterTableDeleteCompositePrimaryKeyConvertor()); -convertors.push(new SqliteAlterTableAlterCompositePrimaryKeyConvertor()); - convertors.push(new PgAlterTableCreateCompositePrimaryKeyConvertor()); convertors.push(new PgAlterTableDeleteCompositePrimaryKeyConvertor()); convertors.push(new PgAlterTableAlterCompositePrimaryKeyConvertor()); @@ -3236,26 +3324,41 @@ convertors.push(new SingleStoreAlterTableCreateCompositePrimaryKeyConvertor()); convertors.push(new SingleStoreAlterTableAddPk()); convertors.push(new SingleStoreAlterTableAlterCompositePrimaryKeyConvertor()); -export const fromJson = (statements: JsonStatement[], dialect: Dialect) => { +// overloads for turso driver +export function fromJson( + statements: JsonStatement[], + dialect: Exclude<Dialect, 'sqlite' | 'turso'>, +): string[]; +export function fromJson( + statements: JsonStatement[], + dialect: 'sqlite' | 'turso', + action?: 'push', + json2?: SQLiteSchemaSquashed, +): string[]; + +export function fromJson( + statements: JsonStatement[], + dialect: Dialect, + action?: 'push', + json2?: SQLiteSchemaSquashed, +) { const result = statements .flatMap((statement) => { const filtered = convertors.filter((it) => { - // console.log(statement, dialect) return it.can(statement, dialect); }); const convertor = filtered.length === 1 ?
filtered[0] : undefined; if (!convertor) { - // console.log("no convertor:", statement.type, dialect); return ''; } - return convertor.convert(statement); + return convertor.convert(statement, json2, action); }) .filter((it) => it !== ''); return result; -}; +} // blog.yo1.dog/updating-enum-values-in-postgresql-the-safe-and-easy-way/ // test case for enum altering diff --git a/drizzle-kit/src/statementCombiner.ts b/drizzle-kit/src/statementCombiner.ts new file mode 100644 index 000000000..2f7b6ddbe --- /dev/null +++ b/drizzle-kit/src/statementCombiner.ts @@ -0,0 +1,450 @@ +import { + JsonCreateIndexStatement, + JsonRecreateTableStatement, + JsonStatement, + prepareCreateIndexesJson, +} from './jsonStatements'; +import { SQLiteSchemaSquashed, SQLiteSquasher } from './serializer/sqliteSchema'; + +export const prepareLibSQLRecreateTable = ( + table: SQLiteSchemaSquashed['tables'][keyof SQLiteSchemaSquashed['tables']], + action?: 'push', +): (JsonRecreateTableStatement | JsonCreateIndexStatement)[] => { + const { name, columns, uniqueConstraints, indexes } = table; + + const composites: string[][] = Object.values(table.compositePrimaryKeys).map( + (it) => SQLiteSquasher.unsquashPK(it), + ); + + const references: string[] = Object.values(table.foreignKeys); + const fks = references.map((it) => + action === 'push' ? 
SQLiteSquasher.unsquashPushFK(it) : SQLiteSquasher.unsquashFK(it) + ); + + const statements: (JsonRecreateTableStatement | JsonCreateIndexStatement)[] = [ + { + type: 'recreate_table', + tableName: name, + columns: Object.values(columns), + compositePKs: composites, + referenceData: fks, + uniqueConstraints: Object.values(uniqueConstraints), + }, + ]; + + if (Object.keys(indexes).length) { + statements.push(...prepareCreateIndexesJson(name, '', indexes)); + } + return statements; +}; + +export const prepareSQLiteRecreateTable = ( + table: SQLiteSchemaSquashed['tables'][keyof SQLiteSchemaSquashed['tables']], + action?: 'push', +): JsonStatement[] => { + const { name, columns, uniqueConstraints, indexes } = table; + + const composites: string[][] = Object.values(table.compositePrimaryKeys).map( + (it) => SQLiteSquasher.unsquashPK(it), + ); + + const references: string[] = Object.values(table.foreignKeys); + const fks = references.map((it) => + action === 'push' ? SQLiteSquasher.unsquashPushFK(it) : SQLiteSquasher.unsquashFK(it) + ); + + const statements: JsonStatement[] = [ + { + type: 'recreate_table', + tableName: name, + columns: Object.values(columns), + compositePKs: composites, + referenceData: fks, + uniqueConstraints: Object.values(uniqueConstraints), + }, + ]; + + if (Object.keys(indexes).length) { + statements.push(...prepareCreateIndexesJson(name, '', indexes)); + } + return statements; +}; + +export const libSQLCombineStatements = ( + statements: JsonStatement[], + json2: SQLiteSchemaSquashed, + action?: 'push', +) => { + // const tablesContext: Record = {}; + const newStatements: Record<string, JsonStatement[]> = {}; + for (const statement of statements) { + if ( + statement.type === 'alter_table_alter_column_drop_autoincrement' + || statement.type === 'alter_table_alter_column_set_autoincrement' + || statement.type === 'alter_table_alter_column_drop_pk' + || statement.type === 'alter_table_alter_column_set_pk' + || statement.type === 'create_composite_pk' + || statement.type 
=== 'alter_composite_pk' + || statement.type === 'delete_composite_pk' + ) { + const tableName = statement.tableName; + + const statementsForTable = newStatements[tableName]; + + if (!statementsForTable) { + newStatements[tableName] = prepareLibSQLRecreateTable(json2.tables[tableName], action); + + continue; + } + + if (!statementsForTable.some(({ type }) => type === 'recreate_table')) { + const wasRename = statementsForTable.some(({ type }) => type === 'rename_table'); + const preparedStatements = prepareLibSQLRecreateTable(json2.tables[tableName], action); + + if (wasRename) { + newStatements[tableName].push(...preparedStatements); + } else { + newStatements[tableName] = preparedStatements; + } + + continue; + } + + continue; + } + + if ( + statement.type === 'alter_table_alter_column_set_type' + || statement.type === 'alter_table_alter_column_drop_notnull' + || statement.type === 'alter_table_alter_column_set_notnull' + || statement.type === 'alter_table_alter_column_set_default' + || statement.type === 'alter_table_alter_column_drop_default' + ) { + const { tableName, columnName, columnPk } = statement; + + // const columnIsPartOfUniqueIndex = Object.values( + // json2.tables[tableName].indexes, + // ).some((it) => { + // const unsquashIndex = SQLiteSquasher.unsquashIdx(it); + + // return ( + // unsquashIndex.columns.includes(columnName) && unsquashIndex.isUnique + // ); + // }); + + const columnIsPartOfForeignKey = Object.values( + json2.tables[tableName].foreignKeys, + ).some((it) => { + const unsquashFk = action === 'push' ? 
SQLiteSquasher.unsquashPushFK(it) : SQLiteSquasher.unsquashFK(it); + + return ( + unsquashFk.columnsFrom.includes(columnName) + ); + }); + + const statementsForTable = newStatements[tableName]; + + if ( + !statementsForTable && (columnIsPartOfForeignKey || columnPk) + ) { + newStatements[tableName] = prepareLibSQLRecreateTable(json2.tables[tableName], action); + continue; + } + + if ( + statementsForTable && (columnIsPartOfForeignKey || columnPk) + ) { + if (!statementsForTable.some(({ type }) => type === 'recreate_table')) { + const wasRename = statementsForTable.some(({ type }) => type === 'rename_table'); + const preparedStatements = prepareLibSQLRecreateTable(json2.tables[tableName], action); + + if (wasRename) { + newStatements[tableName].push(...preparedStatements); + } else { + newStatements[tableName] = preparedStatements; + } + } + continue; + } + if ( + statementsForTable && !(columnIsPartOfForeignKey || columnPk) + ) { + if (!statementsForTable.some(({ type }) => type === 'recreate_table')) { + newStatements[tableName].push(statement); + } + continue; + } + + newStatements[tableName] = [statement]; + + continue; + } + + if (statement.type === 'create_reference') { + const tableName = statement.tableName; + + const data = action === 'push' + ? SQLiteSquasher.unsquashPushFK(statement.data) + : SQLiteSquasher.unsquashFK(statement.data); + + const statementsForTable = newStatements[tableName]; + + if (!statementsForTable) { + newStatements[tableName] = statement.isMulticolumn + ? 
prepareLibSQLRecreateTable(json2.tables[tableName], action) + : [statement]; + + continue; + } + + // if add column with reference -> skip create_reference statement + if ( + !statement.isMulticolumn + && statementsForTable.some((st) => + st.type === 'sqlite_alter_table_add_column' && st.column.name === data.columnsFrom[0] + ) + ) { + continue; + } + + if (statement.isMulticolumn) { + if (!statementsForTable.some(({ type }) => type === 'recreate_table')) { + const wasRename = statementsForTable.some(({ type }) => type === 'rename_table'); + const preparedStatements = prepareLibSQLRecreateTable(json2.tables[tableName], action); + + if (wasRename) { + newStatements[tableName].push(...preparedStatements); + } else { + newStatements[tableName] = preparedStatements; + } + + continue; + } + + continue; + } + + if (!statementsForTable.some(({ type }) => type === 'recreate_table')) { + newStatements[tableName].push(statement); + } + + continue; + } + + if (statement.type === 'delete_reference') { + const tableName = statement.tableName; + + const statementsForTable = newStatements[tableName]; + + if (!statementsForTable) { + newStatements[tableName] = prepareLibSQLRecreateTable(json2.tables[tableName], action); + continue; + } + + if (!statementsForTable.some(({ type }) => type === 'recreate_table')) { + const wasRename = statementsForTable.some(({ type }) => type === 'rename_table'); + const preparedStatements = prepareLibSQLRecreateTable(json2.tables[tableName], action); + + if (wasRename) { + newStatements[tableName].push(...preparedStatements); + } else { + newStatements[tableName] = preparedStatements; + } + + continue; + } + + continue; + } + + if (statement.type === 'sqlite_alter_table_add_column' && statement.column.primaryKey) { + const tableName = statement.tableName; + + const statementsForTable = newStatements[tableName]; + + if (!statementsForTable) { + newStatements[tableName] = prepareLibSQLRecreateTable(json2.tables[tableName], action); + continue; + } + + 
if (!statementsForTable.some(({ type }) => type === 'recreate_table')) { + const wasRename = statementsForTable.some(({ type }) => type === 'rename_table'); + const preparedStatements = prepareLibSQLRecreateTable(json2.tables[tableName], action); + + if (wasRename) { + newStatements[tableName].push(...preparedStatements); + } else { + newStatements[tableName] = preparedStatements; + } + + continue; + } + + continue; + } + + const tableName = statement.type === 'rename_table' + ? statement.tableNameTo + : (statement as { tableName: string }).tableName; + const statementsForTable = newStatements[tableName]; + + if (!statementsForTable) { + newStatements[tableName] = [statement]; + continue; + } + + if (!statementsForTable.some(({ type }) => type === 'recreate_table')) { + newStatements[tableName].push(statement); + } + } + + const combinedStatements = Object.values(newStatements).flat(); + const renamedTables = combinedStatements.filter((it) => it.type === 'rename_table'); + const renamedColumns = combinedStatements.filter((it) => it.type === 'alter_table_rename_column'); + + const rest = combinedStatements.filter((it) => it.type !== 'rename_table' && it.type !== 'alter_table_rename_column'); + + return [...renamedTables, ...renamedColumns, ...rest]; +}; + +export const sqliteCombineStatements = ( + statements: JsonStatement[], + json2: SQLiteSchemaSquashed, + action?: 'push', +) => { + // const tablesContext: Record = {}; + const newStatements: Record<string, JsonStatement[]> = {}; + for (const statement of statements) { + if ( + statement.type === 'alter_table_alter_column_set_type' + || statement.type === 'alter_table_alter_column_set_default' + || statement.type === 'alter_table_alter_column_drop_default' + || statement.type === 'alter_table_alter_column_set_notnull' + || statement.type === 'alter_table_alter_column_drop_notnull' + || statement.type === 'alter_table_alter_column_drop_autoincrement' + || statement.type === 'alter_table_alter_column_set_autoincrement' + || statement.type 
=== 'alter_table_alter_column_drop_pk' + || statement.type === 'alter_table_alter_column_set_pk' + || statement.type === 'delete_reference' + || statement.type === 'alter_reference' + || statement.type === 'create_composite_pk' + || statement.type === 'alter_composite_pk' + || statement.type === 'delete_composite_pk' + || statement.type === 'create_unique_constraint' + || statement.type === 'delete_unique_constraint' + ) { + const tableName = statement.tableName; + + const statementsForTable = newStatements[tableName]; + + if (!statementsForTable) { + newStatements[tableName] = prepareLibSQLRecreateTable(json2.tables[tableName], action); + continue; + } + + if (!statementsForTable.some(({ type }) => type === 'recreate_table')) { + const wasRename = statementsForTable.some(({ type }) => type === 'rename_table'); + const preparedStatements = prepareLibSQLRecreateTable(json2.tables[tableName], action); + + if (wasRename) { + newStatements[tableName].push(...preparedStatements); + } else { + newStatements[tableName] = preparedStatements; + } + + continue; + } + + continue; + } + + if (statement.type === 'sqlite_alter_table_add_column' && statement.column.primaryKey) { + const tableName = statement.tableName; + + const statementsForTable = newStatements[tableName]; + + if (!statementsForTable) { + newStatements[tableName] = prepareLibSQLRecreateTable(json2.tables[tableName], action); + continue; + } + + if (!statementsForTable.some(({ type }) => type === 'recreate_table')) { + const wasRename = statementsForTable.some(({ type }) => type === 'rename_table'); + const preparedStatements = prepareLibSQLRecreateTable(json2.tables[tableName], action); + + if (wasRename) { + newStatements[tableName].push(...preparedStatements); + } else { + newStatements[tableName] = preparedStatements; + } + + continue; + } + + continue; + } + + if (statement.type === 'create_reference') { + const tableName = statement.tableName; + + const data = action === 'push' + ? 
SQLiteSquasher.unsquashPushFK(statement.data) + : SQLiteSquasher.unsquashFK(statement.data); + const statementsForTable = newStatements[tableName]; + + if (!statementsForTable) { + newStatements[tableName] = prepareSQLiteRecreateTable(json2.tables[tableName], action); + continue; + } + + // if add column with reference -> skip create_reference statement + if ( + data.columnsFrom.length === 1 + && statementsForTable.some((st) => + st.type === 'sqlite_alter_table_add_column' && st.column.name === data.columnsFrom[0] + ) + ) { + continue; + } + + if (!statementsForTable.some(({ type }) => type === 'recreate_table')) { + const wasRename = statementsForTable.some(({ type }) => type === 'rename_table'); + const preparedStatements = prepareLibSQLRecreateTable(json2.tables[tableName], action); + + if (wasRename) { + newStatements[tableName].push(...preparedStatements); + } else { + newStatements[tableName] = preparedStatements; + } + + continue; + } + + continue; + } + + const tableName = statement.type === 'rename_table' + ? 
statement.tableNameTo + : (statement as { tableName: string }).tableName; + + const statementsForTable = newStatements[tableName]; + + if (!statementsForTable) { + newStatements[tableName] = [statement]; + continue; + } + + if (!statementsForTable.some(({ type }) => type === 'recreate_table')) { + newStatements[tableName].push(statement); + } + } + + const combinedStatements = Object.values(newStatements).flat(); + + const renamedTables = combinedStatements.filter((it) => it.type === 'rename_table'); + const renamedColumns = combinedStatements.filter((it) => it.type === 'alter_table_rename_column'); + + const rest = combinedStatements.filter((it) => it.type !== 'rename_table' && it.type !== 'alter_table_rename_column'); + + return [...renamedTables, ...renamedColumns, ...rest]; +}; diff --git a/drizzle-kit/src/utils.ts b/drizzle-kit/src/utils.ts index 7b363a9d3..71454550e 100644 --- a/drizzle-kit/src/utils.ts +++ b/drizzle-kit/src/utils.ts @@ -9,6 +9,7 @@ import { assertUnreachable, snapshotVersion } from './global'; import type { Dialect } from './schemaValidator'; import { backwardCompatibleMysqlSchema } from './serializer/mysqlSchema'; import { backwardCompatiblePgSchema } from './serializer/pgSchema'; +import { backwardCompatibleSingleStoreSchema } from './serializer/singlestoreSchema'; import { backwardCompatibleSqliteSchema } from './serializer/sqliteSchema'; import type { ProxyParams } from './serializer/studio'; @@ -25,9 +26,12 @@ export type DB = { export type SQLiteDB = { query: (sql: string, params?: any[]) => Promise<any[]>; run(query: string): Promise<void>; - batch?( - queries: { query: string; values?: any[] | undefined }[], - ): Promise<any>; +}; + +export type LibSQLDB = { + query: (sql: string, params?: any[]) => Promise<any[]>; + run(query: string): Promise<void>; + batchWithPragma?(queries: string[]): Promise<void>; }; export const copy = <T>(it: T): T => { @@ -115,8 +119,12 @@ const validatorForDialect = (dialect: Dialect) => { return { validator: backwardCompatiblePgSchema, version: 
7 }; case 'sqlite': return { validator: backwardCompatibleSqliteSchema, version: 6 }; + case 'turso': + return { validator: backwardCompatibleSqliteSchema, version: 6 }; case 'mysql': return { validator: backwardCompatibleMysqlSchema, version: 5 }; + case 'singlestore': + return { validator: backwardCompatibleSingleStoreSchema, version: 1 }; } }; @@ -341,3 +349,13 @@ export const normalisePGliteUrl = ( export function isPgArrayType(sqlType: string) { return sqlType.match(/.*\[\d*\].*|.*\[\].*/g) !== null; } + +export function findAddedAndRemoved(columnNames1: string[], columnNames2: string[]) { + const set1 = new Set(columnNames1); + const set2 = new Set(columnNames2); + + const addedColumns = columnNames2.filter((it) => !set1.has(it)); + const removedColumns = columnNames1.filter((it) => !set2.has(it)); + + return { addedColumns, removedColumns }; +} diff --git a/drizzle-kit/tests/cli-generate.test.ts b/drizzle-kit/tests/cli-generate.test.ts index 3e5c0fc22..56a3a0d04 100644 --- a/drizzle-kit/tests/cli-generate.test.ts +++ b/drizzle-kit/tests/cli-generate.test.ts @@ -62,6 +62,7 @@ test('generate #2', async (t) => { test('generate #3', async (t) => { const res = await brotest(generate, ''); + if (res.type !== 'handler') assert.fail(res.type, 'handler'); expect(res.options).toStrictEqual({ dialect: 'postgresql', diff --git a/drizzle-kit/tests/cli-migrate.test.ts b/drizzle-kit/tests/cli-migrate.test.ts index a4ffec2f0..1425691f0 100644 --- a/drizzle-kit/tests/cli-migrate.test.ts +++ b/drizzle-kit/tests/cli-migrate.test.ts @@ -31,11 +31,10 @@ test('migrate #2', async (t) => { const res = await brotest(migrate, '--config=turso.config.ts'); if (res.type !== 'handler') assert.fail(res.type, 'handler'); expect(res.options).toStrictEqual({ - dialect: 'sqlite', + dialect: 'turso', out: 'drizzle', credentials: { authToken: 'token', - driver: 'turso', url: 'turso.dev', }, schema: undefined, // drizzle migrations table schema diff --git a/drizzle-kit/tests/cli-push.test.ts 
b/drizzle-kit/tests/cli-push.test.ts index 1a4bde66d..f5b84fdce 100644 --- a/drizzle-kit/tests/cli-push.test.ts +++ b/drizzle-kit/tests/cli-push.test.ts @@ -34,10 +34,9 @@ test('push #2', async (t) => { const res = await brotest(push, '--config=turso.config.ts'); if (res.type !== 'handler') assert.fail(res.type, 'handler'); expect(res.options).toStrictEqual({ - dialect: 'sqlite', + dialect: 'turso', credentials: { authToken: 'token', - driver: 'turso', url: 'turso.dev', }, force: false, diff --git a/drizzle-kit/tests/cli/turso.config.ts b/drizzle-kit/tests/cli/turso.config.ts index 089e4d216..85efe5934 100644 --- a/drizzle-kit/tests/cli/turso.config.ts +++ b/drizzle-kit/tests/cli/turso.config.ts @@ -2,8 +2,7 @@ import { defineConfig } from '../../src'; export default defineConfig({ schema: './schema.ts', - dialect: 'sqlite', - driver: 'turso', + dialect: 'turso', dbCredentials: { url: 'turso.dev', authToken: 'token', diff --git a/drizzle-kit/tests/libsql-statements.test.ts b/drizzle-kit/tests/libsql-statements.test.ts new file mode 100644 index 000000000..8221e52e0 --- /dev/null +++ b/drizzle-kit/tests/libsql-statements.test.ts @@ -0,0 +1,982 @@ +import { foreignKey, index, int, integer, sqliteTable, text, uniqueIndex } from 'drizzle-orm/sqlite-core'; +import { JsonRecreateTableStatement } from 'src/jsonStatements'; +import { expect, test } from 'vitest'; +import { diffTestSchemasLibSQL } from './schemaDiffer'; + +test('drop autoincrement', async (t) => { + const schema1 = { + users: sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: true }), + }), + }; + + const schema2 = { + users: sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: false }), + }), + }; + + const { statements } = await diffTestSchemasLibSQL(schema1, schema2, []); + + expect(statements.length).toBe(1); + expect(statements[0]).toStrictEqual({ + columns: [{ + autoincrement: false, + generated: undefined, + name: 'id', + notNull: true, + primaryKey: true, + type: 
'integer', + }], + compositePKs: [], + referenceData: [], + tableName: 'users', + type: 'recreate_table', + uniqueConstraints: [], + }); +}); + +test('set autoincrement', async (t) => { + const schema1 = { + users: sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: false }), + }), + }; + + const schema2 = { + users: sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: true }), + }), + }; + + const { statements } = await diffTestSchemasLibSQL(schema1, schema2, []); + + expect(statements.length).toBe(1); + expect(statements[0]).toStrictEqual({ + columns: [{ + autoincrement: true, + generated: undefined, + name: 'id', + notNull: true, + primaryKey: true, + type: 'integer', + }], + compositePKs: [], + referenceData: [], + tableName: 'users', + type: 'recreate_table', + uniqueConstraints: [], + }); +}); + +test('set not null', async (t) => { + const schema1 = { + users: sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: true }), + name: text('name'), + }), + }; + + const schema2 = { + users: sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: true }), + name: text('name').notNull(), + }), + }; + + const { statements, sqlStatements } = await diffTestSchemasLibSQL( + schema1, + schema2, + [], + ); + + expect(statements.length).toBe(1); + expect(statements[0]).toStrictEqual({ + type: 'alter_table_alter_column_set_notnull', + tableName: 'users', + columnName: 'name', + schema: '', + newDataType: 'text', + columnDefault: undefined, + columnOnUpdate: undefined, + columnNotNull: true, + columnAutoIncrement: false, + columnPk: false, + }); + + expect(sqlStatements.length).toBe(1); + expect(sqlStatements[0]).toBe( + `ALTER TABLE \`users\` ALTER COLUMN "name" TO "name" text NOT NULL;`, + ); +}); + +test('drop not null', async (t) => { + const schema1 = { + users: sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: true }), + name: text('name').notNull(), + }), + }; + + const schema2 = { + users: 
sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: true }), + name: text('name'), + }), + }; + + const { statements, sqlStatements } = await diffTestSchemasLibSQL( + schema1, + schema2, + [], + ); + + expect(statements.length).toBe(1); + expect(statements[0]).toStrictEqual({ + type: 'alter_table_alter_column_drop_notnull', + tableName: 'users', + columnName: 'name', + schema: '', + newDataType: 'text', + columnDefault: undefined, + columnOnUpdate: undefined, + columnNotNull: false, + columnAutoIncrement: false, + columnPk: false, + }); + + expect(sqlStatements.length).toBe(1); + expect(sqlStatements[0]).toBe( + `ALTER TABLE \`users\` ALTER COLUMN "name" TO "name" text;`, + ); +}); + +test('set default. set not null. add column', async (t) => { + const schema1 = { + users: sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: true }), + name: text('name'), + }), + }; + + const schema2 = { + users: sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: true }), + name: text('name').notNull().default('name'), + age: int('age').notNull(), + }), + }; + + const { statements, sqlStatements } = await diffTestSchemasLibSQL( + schema1, + schema2, + [], + ); + + expect(statements.length).toBe(3); + expect(statements[0]).toStrictEqual({ + type: 'alter_table_alter_column_set_default', + tableName: 'users', + columnName: 'name', + newDefaultValue: "'name'", + schema: '', + newDataType: 'text', + columnOnUpdate: undefined, + columnNotNull: true, + columnAutoIncrement: false, + columnPk: false, + }); + expect(statements[1]).toStrictEqual({ + type: 'alter_table_alter_column_set_notnull', + tableName: 'users', + columnName: 'name', + schema: '', + newDataType: 'text', + columnDefault: "'name'", + columnOnUpdate: undefined, + columnNotNull: true, + columnAutoIncrement: false, + columnPk: false, + }); + expect(statements[2]).toStrictEqual({ + type: 'sqlite_alter_table_add_column', + tableName: 'users', + referenceData: undefined, + column: { + 
name: 'age', + type: 'integer', + primaryKey: false, + notNull: true, + autoincrement: false, + }, + }); + + expect(sqlStatements.length).toBe(2); + expect(sqlStatements[0]).toBe( + `ALTER TABLE \`users\` ALTER COLUMN "name" TO "name" text NOT NULL DEFAULT 'name';`, + ); + expect(sqlStatements[1]).toBe( + `ALTER TABLE \`users\` ADD \`age\` integer NOT NULL;`, + ); +}); + +test('drop default. drop not null', async (t) => { + const schema1 = { + users: sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: true }), + name: text('name').notNull().default('name'), + }), + }; + + const schema2 = { + users: sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: true }), + name: text('name'), + }), + }; + + const { statements, sqlStatements } = await diffTestSchemasLibSQL( + schema1, + schema2, + [], + ); + + expect(statements.length).toBe(2); + expect(statements[0]).toStrictEqual({ + type: 'alter_table_alter_column_drop_default', + tableName: 'users', + columnName: 'name', + schema: '', + newDataType: 'text', + columnDefault: undefined, + columnOnUpdate: undefined, + columnNotNull: false, + columnAutoIncrement: false, + columnPk: false, + }); + expect(statements[1]).toStrictEqual({ + type: 'alter_table_alter_column_drop_notnull', + tableName: 'users', + columnName: 'name', + schema: '', + newDataType: 'text', + columnDefault: undefined, + columnOnUpdate: undefined, + columnNotNull: false, + columnAutoIncrement: false, + columnPk: false, + }); + + expect(sqlStatements.length).toBe(1); + expect(sqlStatements[0]).toBe( + `ALTER TABLE \`users\` ALTER COLUMN "name" TO "name" text;`, + ); +}); + +test('set data type. 
set default', async (t) => { + const schema1 = { + users: sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: true }), + name: text('name'), + }), + }; + + const schema2 = { + users: sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: true }), + name: int('name').default(123), + }), + }; + + const { statements, sqlStatements } = await diffTestSchemasLibSQL( + schema1, + schema2, + [], + ); + + expect(statements.length).toBe(2); + expect(statements[0]).toStrictEqual({ + type: 'alter_table_alter_column_set_type', + tableName: 'users', + columnName: 'name', + newDataType: 'integer', + oldDataType: 'text', + schema: '', + columnDefault: 123, + columnOnUpdate: undefined, + columnNotNull: false, + columnAutoIncrement: false, + columnPk: false, + }); + expect(statements[1]).toStrictEqual({ + type: 'alter_table_alter_column_set_default', + tableName: 'users', + columnName: 'name', + schema: '', + newDataType: 'integer', + newDefaultValue: 123, + columnOnUpdate: undefined, + columnNotNull: false, + columnAutoIncrement: false, + columnPk: false, + }); + + expect(sqlStatements.length).toBe(1); + expect(sqlStatements[0]).toBe( + `ALTER TABLE \`users\` ALTER COLUMN "name" TO "name" integer DEFAULT 123;`, + ); +}); + +test('add foreign key', async (t) => { + const schema = { + table: sqliteTable('table', { + id: int('id').primaryKey({ autoIncrement: true }), + name: text('name'), + }), + }; + + const schema1 = { + users: sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: true }), + tableId: int('table_id'), + }), + }; + + const schema2 = { + users: sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: true }), + tableId: int('table_id').references(() => schema.table.id), + }), + }; + + const { statements, sqlStatements } = await diffTestSchemasLibSQL( + schema1, + schema2, + [], + ); + + expect(statements.length).toBe(1); + expect(statements[0]).toStrictEqual({ + type: 'create_reference', + tableName: 'users', + data: 
'users_table_id_table_id_fk;users;table_id;table;id;no action;no action', + schema: '', + columnNotNull: false, + columnDefault: undefined, + columnType: 'integer', + }); + + expect(sqlStatements.length).toBe(1); + expect(sqlStatements[0]).toBe( + `ALTER TABLE \`users\` ALTER COLUMN "table_id" TO "table_id" integer REFERENCES table(id) ON DELETE no action ON UPDATE no action;`, + ); +}); + +test('drop foreign key', async (t) => { + const schema = { + table: sqliteTable('table', { + id: int('id').primaryKey({ autoIncrement: true }), + name: text('name'), + }), + }; + + const schema1 = { + users: sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: true }), + tableId: int('table_id').references(() => schema.table.id, { + onDelete: 'cascade', + }), + }), + }; + + const schema2 = { + users: sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: true }), + tableId: int('table_id'), + }), + }; + + const { statements, sqlStatements } = await diffTestSchemasLibSQL( + schema1, + schema2, + [], + ); + + expect(statements.length).toBe(1); + expect(statements[0]).toStrictEqual({ + columns: [ + { + autoincrement: true, + generated: undefined, + name: 'id', + notNull: true, + primaryKey: true, + type: 'integer', + }, + { + autoincrement: false, + generated: undefined, + name: 'table_id', + notNull: false, + primaryKey: false, + type: 'integer', + }, + ], + compositePKs: [], + referenceData: [], + tableName: 'users', + type: 'recreate_table', + uniqueConstraints: [], + }); + + expect(sqlStatements.length).toBe(6); + expect(sqlStatements[0]).toBe(`PRAGMA foreign_keys=OFF;`); + expect(sqlStatements[1]).toBe(`CREATE TABLE \`__new_users\` ( +\t\`id\` integer PRIMARY KEY AUTOINCREMENT NOT NULL, +\t\`table_id\` integer +);\n`); + expect(sqlStatements[2]).toBe( + `INSERT INTO \`__new_users\`("id", "table_id") SELECT "id", "table_id" FROM \`users\`;`, + ); + expect(sqlStatements[3]).toBe(`DROP TABLE \`users\`;`); + expect(sqlStatements[4]).toBe( + `ALTER TABLE 
\`__new_users\` RENAME TO \`users\`;`, + ); + expect(sqlStatements[5]).toBe(`PRAGMA foreign_keys=ON;`); +}); + +test('alter foreign key', async (t) => { + const tableRef = sqliteTable('table', { + id: int('id').primaryKey({ autoIncrement: true }), + name: text('name'), + }); + const tableRef2 = sqliteTable('table2', { + id: int('id').primaryKey({ autoIncrement: true }), + name: text('name'), + }); + + const schema1 = { + users: sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: true }), + tableId: int('table_id').references(() => tableRef.id, { + onDelete: 'cascade', + }), + }), + }; + + const schema2 = { + users: sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: true }), + tableId: int('table_id').references(() => tableRef2.id), + }), + }; + + const { statements, sqlStatements } = await diffTestSchemasLibSQL( + schema1, + schema2, + [], + ); + + expect(statements.length).toBe(1); + expect(statements[0]).toStrictEqual({ + columns: [ + { + autoincrement: true, + generated: undefined, + name: 'id', + notNull: true, + primaryKey: true, + type: 'integer', + }, + { + autoincrement: false, + generated: undefined, + name: 'table_id', + notNull: false, + primaryKey: false, + type: 'integer', + }, + ], + compositePKs: [], + referenceData: [ + { + columnsFrom: ['table_id'], + columnsTo: ['id'], + name: 'users_table_id_table2_id_fk', + onDelete: 'no action', + onUpdate: 'no action', + tableFrom: 'users', + tableTo: 'table2', + }, + ], + tableName: 'users', + type: 'recreate_table', + uniqueConstraints: [], + }); + + expect(sqlStatements.length).toBe(6); + expect(sqlStatements[0]).toBe(`PRAGMA foreign_keys=OFF;`); + expect(sqlStatements[1]).toBe(`CREATE TABLE \`__new_users\` ( +\t\`id\` integer PRIMARY KEY AUTOINCREMENT NOT NULL, +\t\`table_id\` integer, +\tFOREIGN KEY (\`table_id\`) REFERENCES \`table2\`(\`id\`) ON UPDATE no action ON DELETE no action +);\n`); + expect(sqlStatements[2]).toBe( + `INSERT INTO \`__new_users\`("id", "table_id") 
SELECT "id", "table_id" FROM \`users\`;`, + ); + expect(sqlStatements[3]).toBe( + 'DROP TABLE `users`;', + ); + expect(sqlStatements[4]).toBe( + 'ALTER TABLE `__new_users` RENAME TO `users`;', + ); + expect(sqlStatements[5]).toBe(`PRAGMA foreign_keys=ON;`); +}); + +test('add foriegn key for multiple columns', async (t) => { + const tableRef = sqliteTable('table', { + id: int('id').primaryKey({ autoIncrement: true }), + age: int('age'), + age1: int('age_1'), + }); + + const schema1 = { + users: sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: true }), + column: int('column'), + column1: int('column_1'), + }), + tableRef, + }; + + const schema2 = { + tableRef, + users: sqliteTable( + 'users', + { + id: int('id').primaryKey({ autoIncrement: true }), + column: int('column'), + column1: int('column_1'), + }, + (table) => ({ + foreignKey: foreignKey({ + columns: [table.column, table.column1], + foreignColumns: [tableRef.age, tableRef.age1], + }), + }), + ), + }; + const { statements, sqlStatements } = await diffTestSchemasLibSQL( + schema1, + schema2, + [], + ); + + expect(statements.length).toBe(1); + expect(statements[0]).toStrictEqual({ + columns: [ + { + autoincrement: true, + generated: undefined, + name: 'id', + notNull: true, + primaryKey: true, + type: 'integer', + }, + { + autoincrement: false, + generated: undefined, + name: 'column', + notNull: false, + primaryKey: false, + type: 'integer', + }, + { + autoincrement: false, + generated: undefined, + name: 'column_1', + notNull: false, + primaryKey: false, + type: 'integer', + }, + ], + compositePKs: [], + referenceData: [ + { + columnsFrom: ['column', 'column_1'], + columnsTo: ['age', 'age_1'], + name: 'users_column_column_1_table_age_age_1_fk', + onDelete: 'no action', + onUpdate: 'no action', + tableFrom: 'users', + tableTo: 'table', + }, + ], + tableName: 'users', + type: 'recreate_table', + uniqueConstraints: [], + } as JsonRecreateTableStatement); + + expect(sqlStatements.length).toBe(6); 
+ expect(sqlStatements[0]).toBe(`PRAGMA foreign_keys=OFF;`); + expect(sqlStatements[1]).toBe( + `CREATE TABLE \`__new_users\` ( +\t\`id\` integer PRIMARY KEY AUTOINCREMENT NOT NULL, +\t\`column\` integer, +\t\`column_1\` integer, +\tFOREIGN KEY (\`column\`,\`column_1\`) REFERENCES \`table\`(\`age\`,\`age_1\`) ON UPDATE no action ON DELETE no action +);\n`, + ); + expect(sqlStatements[2]).toBe( + `INSERT INTO \`__new_users\`("id", "column", "column_1") SELECT "id", "column", "column_1" FROM \`users\`;`, + ); + expect(sqlStatements[3]).toBe(`DROP TABLE \`users\`;`); + expect(sqlStatements[4]).toBe( + `ALTER TABLE \`__new_users\` RENAME TO \`users\`;`, + ); + expect(sqlStatements[5]).toBe(`PRAGMA foreign_keys=ON;`); +}); + +test('drop foreign key for multiple columns', async (t) => { + const tableRef = sqliteTable('table', { + id: int('id').primaryKey({ autoIncrement: true }), + age: int('age'), + age1: int('age_1'), + }); + + const schema1 = { + users: sqliteTable( + 'users', + { + id: int('id').primaryKey({ autoIncrement: true }), + column: int('column'), + column1: int('column_1'), + }, + (table) => ({ + foreignKey: foreignKey({ + columns: [table.column, table.column1], + foreignColumns: [tableRef.age, tableRef.age1], + }), + }), + ), + tableRef, + }; + + const schema2 = { + users: sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: true }), + column: int('column'), + column1: int('column_1'), + }), + tableRef, + }; + const { statements, sqlStatements } = await diffTestSchemasLibSQL( + schema1, + schema2, + [], + ); + + expect(statements.length).toBe(1); + expect(statements[0]).toStrictEqual({ + columns: [ + { + autoincrement: true, + generated: undefined, + name: 'id', + notNull: true, + primaryKey: true, + type: 'integer', + }, + { + autoincrement: false, + generated: undefined, + name: 'column', + notNull: false, + primaryKey: false, + type: 'integer', + }, + { + autoincrement: false, + generated: undefined, + name: 'column_1', + notNull: false, + 
primaryKey: false, + type: 'integer', + }, + ], + compositePKs: [], + referenceData: [], + tableName: 'users', + type: 'recreate_table', + uniqueConstraints: [], + }); + + expect(sqlStatements.length).toBe(6); + expect(sqlStatements[0]).toBe(`PRAGMA foreign_keys=OFF;`); + expect(sqlStatements[1]).toBe( + `CREATE TABLE \`__new_users\` ( +\t\`id\` integer PRIMARY KEY AUTOINCREMENT NOT NULL, +\t\`column\` integer, +\t\`column_1\` integer +);\n`, + ); + expect(sqlStatements[2]).toBe( + `INSERT INTO \`__new_users\`("id", "column", "column_1") SELECT "id", "column", "column_1" FROM \`users\`;`, + ); + expect(sqlStatements[3]).toBe(`DROP TABLE \`users\`;`); + expect(sqlStatements[4]).toBe( + `ALTER TABLE \`__new_users\` RENAME TO \`users\`;`, + ); + expect(sqlStatements[5]).toBe(`PRAGMA foreign_keys=ON;`); +}); + +test('alter column drop generated', async (t) => { + const from = { + users: sqliteTable('table', { + id: int('id').primaryKey().notNull(), + name: text('name').generatedAlwaysAs('drizzle is the best').notNull(), + }), + }; + + const to = { + users: sqliteTable('table', { + id: int('id').primaryKey().notNull(), + name: text('name').notNull(), + }), + }; + + const { statements, sqlStatements } = await diffTestSchemasLibSQL( + from, + to, + [], + ); + + expect(statements.length).toBe(1); + expect(statements[0]).toStrictEqual({ + columnAutoIncrement: false, + columnDefault: undefined, + columnGenerated: undefined, + columnName: 'name', + columnNotNull: true, + columnOnUpdate: undefined, + columnPk: false, + newDataType: 'text', + schema: '', + tableName: 'table', + type: 'alter_table_alter_column_drop_generated', + }); + + expect(sqlStatements.length).toBe(2); + expect(sqlStatements[0]).toBe(`ALTER TABLE \`table\` DROP COLUMN \`name\`;`); + expect(sqlStatements[1]).toBe( + `ALTER TABLE \`table\` ADD \`name\` text NOT NULL;`, + ); +}); + +test('recreate table with nested references', async (t) => { + let users = sqliteTable('users', { + id: int('id').primaryKey({ 
autoIncrement: true }), + name: text('name'), + age: integer('age'), + }); + let subscriptions = sqliteTable('subscriptions', { + id: int('id').primaryKey({ autoIncrement: true }), + userId: integer('user_id').references(() => users.id), + customerId: text('customer_id'), + }); + const schema1 = { + users: users, + subscriptions: subscriptions, + subscriptionMetadata: sqliteTable('subscriptions_metadata', { + id: int('id').primaryKey({ autoIncrement: true }), + subscriptionId: text('subscription_id').references( + () => subscriptions.id, + ), + }), + }; + + users = sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: false }), + name: text('name'), + age: integer('age'), + }); + const schema2 = { + users: users, + subscriptions: subscriptions, + subscriptionMetadata: sqliteTable('subscriptions_metadata', { + id: int('id').primaryKey({ autoIncrement: true }), + subscriptionId: text('subscription_id').references( + () => subscriptions.id, + ), + }), + }; + + const { statements, sqlStatements } = await diffTestSchemasLibSQL( + schema1, + schema2, + [], + ); + + expect(statements.length).toBe(1); + expect(statements[0]).toStrictEqual({ + columns: [ + { + autoincrement: false, + generated: undefined, + name: 'id', + notNull: true, + primaryKey: true, + type: 'integer', + }, + { + autoincrement: false, + generated: undefined, + name: 'name', + notNull: false, + primaryKey: false, + type: 'text', + }, + { + autoincrement: false, + generated: undefined, + name: 'age', + notNull: false, + primaryKey: false, + type: 'integer', + }, + ], + compositePKs: [], + referenceData: [], + tableName: 'users', + type: 'recreate_table', + uniqueConstraints: [], + }); + + expect(sqlStatements.length).toBe(6); + expect(sqlStatements[0]).toBe(`PRAGMA foreign_keys=OFF;`); + expect(sqlStatements[1]).toBe(`CREATE TABLE \`__new_users\` ( +\t\`id\` integer PRIMARY KEY NOT NULL, +\t\`name\` text, +\t\`age\` integer +);\n`); + expect(sqlStatements[2]).toBe( + `INSERT INTO 
\`__new_users\`("id", "name", "age") SELECT "id", "name", "age" FROM \`users\`;`, + ); + expect(sqlStatements[3]).toBe(`DROP TABLE \`users\`;`); + expect(sqlStatements[4]).toBe( + `ALTER TABLE \`__new_users\` RENAME TO \`users\`;`, + ); + expect(sqlStatements[5]).toBe(`PRAGMA foreign_keys=ON;`); +}); + +test('set not null with index', async (t) => { + const schema1 = { + users: sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: true }), + name: text('name'), + }, (table) => ({ + someIndex: index('users_name_index').on(table.name), + })), + }; + + const schema2 = { + users: sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: true }), + name: text('name').notNull(), + }, (table) => ({ + someIndex: index('users_name_index').on(table.name), + })), + }; + + const { statements, sqlStatements } = await diffTestSchemasLibSQL( + schema1, + schema2, + [], + ); + + expect(statements.length).toBe(1); + expect(statements[0]).toStrictEqual({ + type: 'alter_table_alter_column_set_notnull', + tableName: 'users', + columnName: 'name', + schema: '', + newDataType: 'text', + columnDefault: undefined, + columnOnUpdate: undefined, + columnNotNull: true, + columnAutoIncrement: false, + columnPk: false, + }); + + expect(sqlStatements.length).toBe(3); + expect(sqlStatements[0]).toBe( + `DROP INDEX IF EXISTS "users_name_index";`, + ); + expect(sqlStatements[1]).toBe( + `ALTER TABLE \`users\` ALTER COLUMN "name" TO "name" text NOT NULL;`, + ); + expect(sqlStatements[2]).toBe( + `CREATE INDEX \`users_name_index\` ON \`users\` (\`name\`);`, + ); +}); + +test('drop not null with two indexes', async (t) => { + const schema1 = { + users: sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: true }), + name: text('name').notNull(), + age: int('age').notNull(), + }, (table) => ({ + someUniqueIndex: uniqueIndex('users_name_unique').on(table.name), + someIndex: index('users_age_index').on(table.age), + })), + }; + + const schema2 = { + users: 
sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: true }), + name: text('name'), + age: int('age').notNull(), + }, (table) => ({ + someUniqueIndex: uniqueIndex('users_name_unique').on(table.name), + someIndex: index('users_age_index').on(table.age), + })), + }; + + const { statements, sqlStatements } = await diffTestSchemasLibSQL( + schema1, + schema2, + [], + ); + + expect(statements.length).toBe(1); + expect(statements[0]).toStrictEqual({ + type: 'alter_table_alter_column_drop_notnull', + tableName: 'users', + columnName: 'name', + schema: '', + newDataType: 'text', + columnDefault: undefined, + columnOnUpdate: undefined, + columnNotNull: false, + columnAutoIncrement: false, + columnPk: false, + }); + + expect(sqlStatements.length).toBe(5); + expect(sqlStatements[0]).toBe( + `DROP INDEX IF EXISTS "users_name_unique";`, + ); + expect(sqlStatements[1]).toBe( + `DROP INDEX IF EXISTS "users_age_index";`, + ); + expect(sqlStatements[2]).toBe( + `ALTER TABLE \`users\` ALTER COLUMN "name" TO "name" text;`, + ); + expect(sqlStatements[3]).toBe( + `CREATE UNIQUE INDEX \`users_name_unique\` ON \`users\` (\`name\`);`, + ); + expect(sqlStatements[4]).toBe( + `CREATE INDEX \`users_age_index\` ON \`users\` (\`age\`);`, + ); +}); diff --git a/drizzle-kit/tests/migrate/libsq-schema.ts b/drizzle-kit/tests/migrate/libsq-schema.ts new file mode 100644 index 000000000..5cb344d51 --- /dev/null +++ b/drizzle-kit/tests/migrate/libsq-schema.ts @@ -0,0 +1,6 @@ +import { integer, sqliteTable, text } from 'drizzle-orm/sqlite-core'; + +export const users = sqliteTable('users', { + id: integer('id').primaryKey().notNull(), + name: text('name').notNull(), +}); diff --git a/drizzle-kit/tests/migrate/libsql-migrate.test.ts b/drizzle-kit/tests/migrate/libsql-migrate.test.ts new file mode 100644 index 000000000..b937b644f --- /dev/null +++ b/drizzle-kit/tests/migrate/libsql-migrate.test.ts @@ -0,0 +1,58 @@ +import { createClient } from '@libsql/client'; +import { connectToLibSQL } 
from 'src/cli/connections'; +import { expect, test } from 'vitest'; + +test('validate migrate function', async () => { + const credentials = { + url: ':memory:', + }; + const { migrate, query } = await connectToLibSQL(credentials); + + await migrate({ migrationsFolder: 'tests/migrate/migrations' }); + + const res = await query(`PRAGMA table_info("users");`); + + expect(res).toStrictEqual([{ + cid: 0, + name: 'id', + type: 'INTEGER', + notnull: 0, + dflt_value: null, + pk: 0, + }, { + cid: 1, + name: 'name', + type: 'INTEGER', + notnull: 1, + dflt_value: null, + pk: 0, + }]); +}); + +// test('validate migrate function', async () => { +// const credentials = { +// url: '', +// authToken: '', +// }; +// const { migrate, query } = await connectToLibSQL(credentials); + +// await migrate({ migrationsFolder: 'tests/migrate/migrations' }); + +// const res = await query(`PRAGMA table_info("users");`); + +// expect(res).toStrictEqual([{ +// cid: 0, +// name: 'id', +// type: 'INTEGER', +// notnull: 0, +// dflt_value: null, +// pk: 0, +// }, { +// cid: 1, +// name: 'name', +// type: 'INTEGER', +// notnull: 1, +// dflt_value: null, +// pk: 0, +// }]); +// }); diff --git a/drizzle-kit/tests/migrate/migrations/0000_little_blizzard.sql b/drizzle-kit/tests/migrate/migrations/0000_little_blizzard.sql new file mode 100644 index 000000000..9de0a139d --- /dev/null +++ b/drizzle-kit/tests/migrate/migrations/0000_little_blizzard.sql @@ -0,0 +1,4 @@ +CREATE TABLE `users` ( + `id` integer PRIMARY KEY NOT NULL, + `name` text NOT NULL +); diff --git a/drizzle-kit/tests/migrate/migrations/0001_nebulous_storm.sql b/drizzle-kit/tests/migrate/migrations/0001_nebulous_storm.sql new file mode 100644 index 000000000..4309a05c2 --- /dev/null +++ b/drizzle-kit/tests/migrate/migrations/0001_nebulous_storm.sql @@ -0,0 +1,10 @@ +PRAGMA foreign_keys=OFF;--> statement-breakpoint +CREATE TABLE `__new_users` ( + `id` integer, + `name` integer NOT NULL +); +--> statement-breakpoint +INSERT INTO 
`__new_users`("id", "name") SELECT "id", "name" FROM `users`;--> statement-breakpoint +DROP TABLE `users`;--> statement-breakpoint +ALTER TABLE `__new_users` RENAME TO `users`;--> statement-breakpoint +PRAGMA foreign_keys=ON; \ No newline at end of file diff --git a/drizzle-kit/tests/migrate/migrations/meta/0000_snapshot.json b/drizzle-kit/tests/migrate/migrations/meta/0000_snapshot.json new file mode 100644 index 000000000..599d02b91 --- /dev/null +++ b/drizzle-kit/tests/migrate/migrations/meta/0000_snapshot.json @@ -0,0 +1,40 @@ +{ + "version": "6", + "dialect": "sqlite", + "id": "2bd46776-9e41-4a6c-b617-5c600bb176f2", + "prevId": "00000000-0000-0000-0000-000000000000", + "tables": { + "users": { + "name": "users", + "columns": { + "id": { + "name": "id", + "type": "integer", + "primaryKey": true, + "notNull": true, + "autoincrement": false + }, + "name": { + "name": "name", + "type": "text", + "primaryKey": false, + "notNull": true, + "autoincrement": false + } + }, + "indexes": {}, + "foreignKeys": {}, + "compositePrimaryKeys": {}, + "uniqueConstraints": {} + } + }, + "enums": {}, + "_meta": { + "schemas": {}, + "tables": {}, + "columns": {} + }, + "internal": { + "indexes": {} + } +} \ No newline at end of file diff --git a/drizzle-kit/tests/migrate/migrations/meta/0001_snapshot.json b/drizzle-kit/tests/migrate/migrations/meta/0001_snapshot.json new file mode 100644 index 000000000..e3b26ba14 --- /dev/null +++ b/drizzle-kit/tests/migrate/migrations/meta/0001_snapshot.json @@ -0,0 +1,40 @@ +{ + "version": "6", + "dialect": "sqlite", + "id": "6c0ec455-42fd-47fd-a22c-4bb4551e1358", + "prevId": "2bd46776-9e41-4a6c-b617-5c600bb176f2", + "tables": { + "users": { + "name": "users", + "columns": { + "id": { + "name": "id", + "type": "integer", + "primaryKey": false, + "notNull": false, + "autoincrement": false + }, + "name": { + "name": "name", + "type": "integer", + "primaryKey": false, + "notNull": true, + "autoincrement": false + } + }, + "indexes": {}, + 
"foreignKeys": {}, + "compositePrimaryKeys": {}, + "uniqueConstraints": {} + } + }, + "enums": {}, + "_meta": { + "schemas": {}, + "tables": {}, + "columns": {} + }, + "internal": { + "indexes": {} + } +} \ No newline at end of file diff --git a/drizzle-kit/tests/migrate/migrations/meta/_journal.json b/drizzle-kit/tests/migrate/migrations/meta/_journal.json new file mode 100644 index 000000000..c836eb194 --- /dev/null +++ b/drizzle-kit/tests/migrate/migrations/meta/_journal.json @@ -0,0 +1,20 @@ +{ + "version": "7", + "dialect": "sqlite", + "entries": [ + { + "idx": 0, + "version": "6", + "when": 1725358702427, + "tag": "0000_little_blizzard", + "breakpoints": true + }, + { + "idx": 1, + "version": "6", + "when": 1725358713033, + "tag": "0001_nebulous_storm", + "breakpoints": true + } + ] +} \ No newline at end of file diff --git a/drizzle-kit/tests/push/libsql.test.ts b/drizzle-kit/tests/push/libsql.test.ts new file mode 100644 index 000000000..89ec008ca --- /dev/null +++ b/drizzle-kit/tests/push/libsql.test.ts @@ -0,0 +1,1049 @@ +import { createClient } from '@libsql/client'; +import chalk from 'chalk'; +import { sql } from 'drizzle-orm'; +import { + blob, + foreignKey, + getTableConfig, + index, + int, + integer, + numeric, + real, + sqliteTable, + text, + uniqueIndex, +} from 'drizzle-orm/sqlite-core'; +import { diffTestSchemasPushLibSQL } from 'tests/schemaDiffer'; +import { expect, test } from 'vitest'; + +test('nothing changed in schema', async (t) => { + const turso = createClient({ + url: ':memory:', + }); + + const users = sqliteTable('users', { + id: integer('id').primaryKey().notNull(), + name: text('name').notNull(), + email: text('email'), + textJson: text('text_json', { mode: 'json' }), + blobJon: blob('blob_json', { mode: 'json' }), + blobBigInt: blob('blob_bigint', { mode: 'bigint' }), + numeric: numeric('numeric'), + createdAt: integer('created_at', { mode: 'timestamp' }), + createdAtMs: integer('created_at_ms', { mode: 'timestamp_ms' }), + real: 
real('real'), + text: text('text', { length: 255 }), + role: text('role', { enum: ['admin', 'user'] }).default('user'), + isConfirmed: integer('is_confirmed', { + mode: 'boolean', + }), + }); + + const schema1 = { + users, + + customers: sqliteTable('customers', { + id: integer('id').primaryKey(), + address: text('address').notNull(), + isConfirmed: integer('is_confirmed', { mode: 'boolean' }), + registrationDate: integer('registration_date', { mode: 'timestamp_ms' }) + .notNull() + .$defaultFn(() => new Date()), + userId: integer('user_id') + .references(() => users.id) + .notNull(), + }), + + posts: sqliteTable('posts', { + id: integer('id').primaryKey(), + content: text('content'), + authorId: integer('author_id'), + }), + }; + + const { + sqlStatements, + statements, + columnsToRemove, + infoToPrint, + shouldAskForApprove, + tablesToRemove, + tablesToTruncate, + } = await diffTestSchemasPushLibSQL(turso, schema1, schema1, [], false); + expect(sqlStatements.length).toBe(0); + expect(statements.length).toBe(0); + expect(columnsToRemove!.length).toBe(0); + expect(infoToPrint!.length).toBe(0); + expect(shouldAskForApprove).toBe(false); + expect(tablesToRemove!.length).toBe(0); + expect(tablesToTruncate!.length).toBe(0); + expect(shouldAskForApprove).toBe(false); +}); + +test('added, dropped index', async (t) => { + const turso = createClient({ + url: ':memory:', + }); + + const users = sqliteTable('users', { + id: integer('id').primaryKey().notNull(), + name: text('name').notNull(), + email: text('email'), + textJson: text('text_json', { mode: 'json' }), + blobJon: blob('blob_json', { mode: 'json' }), + blobBigInt: blob('blob_bigint', { mode: 'bigint' }), + numeric: numeric('numeric'), + createdAt: integer('created_at', { mode: 'timestamp' }), + createdAtMs: integer('created_at_ms', { mode: 'timestamp_ms' }), + real: real('real'), + text: text('text', { length: 255 }), + role: text('role', { enum: ['admin', 'user'] }).default('user'), + isConfirmed: 
integer('is_confirmed', { + mode: 'boolean', + }), + }); + + const schema1 = { + users, + customers: sqliteTable( + 'customers', + { + id: integer('id').primaryKey(), + address: text('address').notNull(), + isConfirmed: integer('is_confirmed', { mode: 'boolean' }), + registrationDate: integer('registration_date', { mode: 'timestamp_ms' }) + .notNull() + .$defaultFn(() => new Date()), + userId: integer('user_id').notNull(), + }, + (table) => ({ + uniqueIndex: uniqueIndex('customers_address_unique').on(table.address), + }), + ), + + posts: sqliteTable('posts', { + id: integer('id').primaryKey(), + content: text('content'), + authorId: integer('author_id'), + }), + }; + + const schema2 = { + users, + customers: sqliteTable( + 'customers', + { + id: integer('id').primaryKey(), + address: text('address').notNull(), + isConfirmed: integer('is_confirmed', { mode: 'boolean' }), + registrationDate: integer('registration_date', { mode: 'timestamp_ms' }) + .notNull() + .$defaultFn(() => new Date()), + userId: integer('user_id').notNull(), + }, + (table) => ({ + uniqueIndex: uniqueIndex('customers_is_confirmed_unique').on( + table.isConfirmed, + ), + }), + ), + + posts: sqliteTable('posts', { + id: integer('id').primaryKey(), + content: text('content'), + authorId: integer('author_id'), + }), + }; + + const { + sqlStatements, + statements, + columnsToRemove, + infoToPrint, + shouldAskForApprove, + tablesToRemove, + tablesToTruncate, + } = await diffTestSchemasPushLibSQL(turso, schema1, schema2, [], false); + + expect(statements.length).toBe(2); + expect(statements[0]).toStrictEqual({ + type: 'drop_index', + tableName: 'customers', + data: 'customers_address_unique;address;true;', + schema: '', + }); + expect(statements[1]).toStrictEqual({ + type: 'create_index', + tableName: 'customers', + data: 'customers_is_confirmed_unique;is_confirmed;true;', + schema: '', + internal: { indexes: {} }, + }); + + expect(sqlStatements.length).toBe(2); + expect(sqlStatements[0]).toBe( + `DROP 
INDEX IF EXISTS \`customers_address_unique\`;`, + ); + expect(sqlStatements[1]).toBe( + `CREATE UNIQUE INDEX \`customers_is_confirmed_unique\` ON \`customers\` (\`is_confirmed\`);`, + ); + + expect(columnsToRemove!.length).toBe(0); + expect(infoToPrint!.length).toBe(0); + expect(shouldAskForApprove).toBe(false); + expect(tablesToRemove!.length).toBe(0); + expect(tablesToTruncate!.length).toBe(0); +}); + +test('added column not null and without default to table with data', async (t) => { + const turso = createClient({ + url: ':memory:', + }); + + const schema1 = { + companies: sqliteTable('companies', { + id: integer('id').primaryKey(), + name: text('name').notNull(), + }), + }; + + const schema2 = { + companies: sqliteTable('companies', { + id: integer('id').primaryKey(), + name: text('name').notNull(), + age: integer('age').notNull(), + }), + }; + + const table = getTableConfig(schema1.companies); + + const seedStatements = [ + `INSERT INTO \`${table.name}\` ("${schema1.companies.name.name}") VALUES ('drizzle');`, + `INSERT INTO \`${table.name}\` ("${schema1.companies.name.name}") VALUES ('turso');`, + ]; + + const { + statements, + sqlStatements, + columnsToRemove, + infoToPrint, + shouldAskForApprove, + tablesToRemove, + tablesToTruncate, + } = await diffTestSchemasPushLibSQL( + turso, + schema1, + schema2, + [], + false, + seedStatements, + ); + + expect(statements.length).toBe(1); + expect(statements[0]).toStrictEqual({ + type: 'sqlite_alter_table_add_column', + tableName: 'companies', + column: { + name: 'age', + type: 'integer', + primaryKey: false, + notNull: true, + autoincrement: false, + }, + referenceData: undefined, + }); + + expect(sqlStatements.length).toBe(2); + expect(sqlStatements[0]).toBe(`delete from companies;`); + expect(sqlStatements[1]).toBe( + `ALTER TABLE \`companies\` ADD \`age\` integer NOT NULL;`, + ); + + expect(columnsToRemove!.length).toBe(0); + expect(infoToPrint!.length).toBe(1); + expect(infoToPrint![0]).toBe( + `· You're about to 
add not-null ${ + chalk.underline( + 'age', + ) + } column without default value, which contains 2 items`, + ); + expect(shouldAskForApprove).toBe(true); + expect(tablesToRemove!.length).toBe(0); + expect(tablesToTruncate!.length).toBe(1); + expect(tablesToTruncate![0]).toBe('companies'); +}); + +test('added column not null and without default to table without data', async (t) => { + const turso = createClient({ + url: ':memory:', + }); + + const schema1 = { + companies: sqliteTable('companies', { + id: integer('id').primaryKey(), + name: text('name').notNull(), + }), + }; + + const schema2 = { + companies: sqliteTable('companies', { + id: integer('id').primaryKey(), + name: text('name').notNull(), + age: integer('age').notNull(), + }), + }; + + const { + sqlStatements, + statements, + columnsToRemove, + infoToPrint, + shouldAskForApprove, + tablesToRemove, + tablesToTruncate, + } = await diffTestSchemasPushLibSQL(turso, schema1, schema2, [], false); + + expect(statements.length).toBe(1); + expect(statements[0]).toStrictEqual({ + type: 'sqlite_alter_table_add_column', + tableName: 'companies', + column: { + name: 'age', + type: 'integer', + primaryKey: false, + notNull: true, + autoincrement: false, + }, + referenceData: undefined, + }); + + expect(sqlStatements.length).toBe(1); + expect(sqlStatements[0]).toBe( + `ALTER TABLE \`companies\` ADD \`age\` integer NOT NULL;`, + ); + + expect(infoToPrint!.length).toBe(0); + expect(columnsToRemove!.length).toBe(0); + expect(shouldAskForApprove).toBe(false); + expect(tablesToRemove!.length).toBe(0); + expect(tablesToTruncate!.length).toBe(0); +}); + +test('drop autoincrement. 
drop column with data', async (t) => { + const turso = createClient({ + url: ':memory:', + }); + + const schema1 = { + companies: sqliteTable('companies', { + id: integer('id').primaryKey({ autoIncrement: true }), + name: text('name'), + }), + }; + + const schema2 = { + companies: sqliteTable('companies', { + id: integer('id').primaryKey({ autoIncrement: false }), + }), + }; + + const table = getTableConfig(schema1.companies); + const seedStatements = [ + `INSERT INTO \`${table.name}\` ("${schema1.companies.id.name}", "${schema1.companies.name.name}") VALUES (1, 'drizzle');`, + `INSERT INTO \`${table.name}\` ("${schema1.companies.id.name}", "${schema1.companies.name.name}") VALUES (2, 'turso');`, + ]; + + const { + sqlStatements, + statements, + columnsToRemove, + infoToPrint, + shouldAskForApprove, + tablesToRemove, + tablesToTruncate, + } = await diffTestSchemasPushLibSQL( + turso, + schema1, + schema2, + [], + false, + seedStatements, + ); + + expect(statements.length).toBe(1); + expect(statements[0]).toStrictEqual({ + type: 'recreate_table', + tableName: 'companies', + columns: [ + { + name: 'id', + type: 'integer', + autoincrement: false, + notNull: true, + primaryKey: true, + generated: undefined, + }, + ], + compositePKs: [], + referenceData: [], + uniqueConstraints: [], + }); + + expect(sqlStatements.length).toBe(4); + expect(sqlStatements[0]).toBe( + `CREATE TABLE \`__new_companies\` ( +\t\`id\` integer PRIMARY KEY NOT NULL +);\n`, + ); + expect(sqlStatements[1]).toBe(`INSERT INTO \`__new_companies\`("id") SELECT "id" FROM \`companies\`;`); + expect(sqlStatements[2]).toBe(`DROP TABLE \`companies\`;`); + expect(sqlStatements[3]).toBe( + `ALTER TABLE \`__new_companies\` RENAME TO \`companies\`;`, + ); + + expect(columnsToRemove!.length).toBe(1); + expect(infoToPrint!.length).toBe(1); + expect(infoToPrint![0]).toBe( + `· You're about to delete ${ + chalk.underline( + 'name', + ) + } column in companies table with 2 items`, + ); + 
expect(shouldAskForApprove).toBe(true); + expect(tablesToRemove!.length).toBe(0); + expect(tablesToTruncate!.length).toBe(0); +}); + +test('change autoincrement. table is part of foreign key', async (t) => { + const turso = createClient({ + url: ':memory:', + }); + + const companies1 = sqliteTable('companies', { + id: integer('id').primaryKey({ autoIncrement: true }), + }); + const users1 = sqliteTable('users', { + id: integer('id').primaryKey({ autoIncrement: true }), + name: text('name').unique(), + companyId: integer('company_id').references(() => companies1.id), + }); + const schema1 = { + companies: companies1, + users: users1, + }; + + const companies2 = sqliteTable('companies', { + id: integer('id').primaryKey({ autoIncrement: false }), + }); + const users2 = sqliteTable('users', { + id: integer('id').primaryKey({ autoIncrement: true }), + name: text('name').unique(), + companyId: integer('company_id').references(() => companies2.id), + }); + const schema2 = { + companies: companies2, + users: users2, + }; + + const { name: usersTableName } = getTableConfig(users1); + const { name: companiesTableName } = getTableConfig(companies1); + const seedStatements = [ + `INSERT INTO \`${usersTableName}\` ("${schema1.users.name.name}") VALUES ('drizzle');`, + `INSERT INTO \`${usersTableName}\` ("${schema1.users.name.name}") VALUES ('turso');`, + `INSERT INTO \`${companiesTableName}\` ("${schema1.companies.id.name}") VALUES (1);`, + `INSERT INTO \`${companiesTableName}\` ("${schema1.companies.id.name}") VALUES (2);`, + ]; + + const { + statements, + sqlStatements, + columnsToRemove, + infoToPrint, + shouldAskForApprove, + tablesToRemove, + tablesToTruncate, + } = await diffTestSchemasPushLibSQL( + turso, + schema1, + schema2, + [], + false, + seedStatements, + ); + + expect(statements.length).toBe(1); + expect(statements[0]).toStrictEqual({ + type: 'recreate_table', + tableName: 'companies', + columns: [ + { + name: 'id', + type: 'integer', + autoincrement: false, + 
notNull: true, + primaryKey: true, + generated: undefined, + }, + ], + compositePKs: [], + referenceData: [], + uniqueConstraints: [], + }); + + expect(sqlStatements.length).toBe(4); + expect(sqlStatements[0]).toBe( + `CREATE TABLE \`__new_companies\` ( +\t\`id\` integer PRIMARY KEY NOT NULL +);\n`, + ); + expect(sqlStatements[1]).toBe( + `INSERT INTO \`__new_companies\`("id") SELECT "id" FROM \`companies\`;`, + ); + expect(sqlStatements[2]).toBe(`DROP TABLE \`companies\`;`); + expect(sqlStatements[3]).toBe( + `ALTER TABLE \`__new_companies\` RENAME TO \`companies\`;`, + ); + + expect(columnsToRemove!.length).toBe(0); + expect(infoToPrint!.length).toBe(0); + expect(shouldAskForApprove).toBe(false); + expect(tablesToRemove!.length).toBe(0); + expect(tablesToTruncate!.length).toBe(0); +}); + +test('drop not null, add not null', async (t) => { + const turso = createClient({ + url: ':memory:', + }); + + const schema1 = { + users: sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: true }), + name: text('name').notNull(), + }), + posts: sqliteTable( + 'posts', + { + id: int('id').primaryKey({ autoIncrement: true }), + name: text('name'), + userId: int('user_id'), + }, + ), + }; + + const schema2 = { + users: sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: true }), + name: text('name'), + }), + posts: sqliteTable( + 'posts', + { + id: int('id').primaryKey({ autoIncrement: true }), + name: text('name').notNull(), + userId: int('user_id'), + }, + ), + }; + const { + statements, + sqlStatements, + columnsToRemove, + infoToPrint, + shouldAskForApprove, + tablesToRemove, + tablesToTruncate, + } = await diffTestSchemasPushLibSQL( + turso, + schema1, + schema2, + [], + ); + + expect(statements!.length).toBe(2); + expect(statements![0]).toStrictEqual({ + columnAutoIncrement: false, + columnDefault: undefined, + columnName: 'name', + columnNotNull: false, + columnOnUpdate: undefined, + columnPk: false, + newDataType: 'text', + schema: '', + 
tableName: 'users', + type: 'alter_table_alter_column_drop_notnull', + }); + expect(statements![1]).toStrictEqual({ + columnAutoIncrement: false, + columnDefault: undefined, + columnName: 'name', + columnNotNull: true, + columnOnUpdate: undefined, + columnPk: false, + newDataType: 'text', + schema: '', + tableName: 'posts', + type: 'alter_table_alter_column_set_notnull', + }); + expect(sqlStatements!.length).toBe(2); + expect(sqlStatements![0]).toBe(`ALTER TABLE \`users\` ALTER COLUMN "name" TO "name" text;`); + expect(sqlStatements![1]).toBe(`ALTER TABLE \`posts\` ALTER COLUMN "name" TO "name" text NOT NULL;`); + expect(columnsToRemove!.length).toBe(0); + expect(infoToPrint!.length).toBe(0); + expect(shouldAskForApprove).toBe(false); + expect(tablesToRemove!.length).toBe(0); + expect(tablesToTruncate!.length).toBe(0); +}); + +test('drop table with data', async (t) => { + const turso = createClient({ + url: ':memory:', + }); + + const schema1 = { + users: sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: true }), + name: text('name').notNull(), + }), + posts: sqliteTable( + 'posts', + { + id: int('id').primaryKey({ autoIncrement: true }), + name: text('name'), + userId: int('user_id'), + }, + ), + }; + + const schema2 = { + posts: sqliteTable( + 'posts', + { + id: int('id').primaryKey({ autoIncrement: true }), + name: text('name'), + userId: int('user_id'), + }, + ), + }; + + const seedStatements = [ + `INSERT INTO \`users\` ("name") VALUES ('drizzle')`, + ]; + const { + statements, + sqlStatements, + columnsToRemove, + infoToPrint, + shouldAskForApprove, + tablesToRemove, + tablesToTruncate, + } = await diffTestSchemasPushLibSQL( + turso, + schema1, + schema2, + [], + false, + seedStatements, + ); + + expect(statements!.length).toBe(1); + expect(statements![0]).toStrictEqual({ + schema: undefined, + tableName: 'users', + type: 'drop_table', + }); + + expect(sqlStatements!.length).toBe(1); + expect(sqlStatements![0]).toBe(`DROP TABLE \`users\`;`); 
+ expect(columnsToRemove!.length).toBe(0); + expect(infoToPrint!.length).toBe(1); + expect(infoToPrint![0]).toBe(`· You're about to delete ${chalk.underline('users')} table with 1 items`); + expect(shouldAskForApprove).toBe(true); + expect(tablesToRemove!.length).toBe(1); + expect(tablesToRemove![0]).toBe('users'); + expect(tablesToTruncate!.length).toBe(0); +}); + +test('recreate table with nested references', async (t) => { + const turso = createClient({ + url: ':memory:', + }); + + let users = sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: true }), + name: text('name'), + age: integer('age'), + }); + let subscriptions = sqliteTable('subscriptions', { + id: int('id').primaryKey({ autoIncrement: true }), + userId: integer('user_id').references(() => users.id), + customerId: text('customer_id'), + }); + const schema1 = { + users: users, + subscriptions: subscriptions, + subscriptionMetadata: sqliteTable('subscriptions_metadata', { + id: int('id').primaryKey({ autoIncrement: true }), + subscriptionId: text('subscription_id').references( + () => subscriptions.id, + ), + }), + }; + + users = sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: false }), + name: text('name'), + age: integer('age'), + }); + const schema2 = { + users: users, + subscriptions: subscriptions, + subscriptionMetadata: sqliteTable('subscriptions_metadata', { + id: int('id').primaryKey({ autoIncrement: true }), + subscriptionId: text('subscription_id').references( + () => subscriptions.id, + ), + }), + }; + + const { + statements, + sqlStatements, + columnsToRemove, + infoToPrint, + shouldAskForApprove, + tablesToRemove, + tablesToTruncate, + } = await diffTestSchemasPushLibSQL(turso, schema1, schema2, []); + + expect(statements!.length).toBe(1); + expect(statements![0]).toStrictEqual({ + columns: [ + { + autoincrement: false, + name: 'id', + notNull: true, + generated: undefined, + primaryKey: true, + type: 'integer', + }, + { + autoincrement: false, + name: 
'name', + notNull: false, + generated: undefined, + primaryKey: false, + type: 'text', + }, + { + autoincrement: false, + name: 'age', + notNull: false, + generated: undefined, + primaryKey: false, + type: 'integer', + }, + ], + compositePKs: [], + referenceData: [], + tableName: 'users', + type: 'recreate_table', + uniqueConstraints: [], + }); + + expect(sqlStatements!.length).toBe(4); + expect(sqlStatements![0]).toBe(`CREATE TABLE \`__new_users\` ( +\t\`id\` integer PRIMARY KEY NOT NULL, +\t\`name\` text, +\t\`age\` integer +);\n`); + expect(sqlStatements![1]).toBe( + `INSERT INTO \`__new_users\`("id", "name", "age") SELECT "id", "name", "age" FROM \`users\`;`, + ); + expect(sqlStatements![2]).toBe(`DROP TABLE \`users\`;`); + expect(sqlStatements![3]).toBe( + `ALTER TABLE \`__new_users\` RENAME TO \`users\`;`, + ); + + expect(columnsToRemove!.length).toBe(0); + expect(infoToPrint!.length).toBe(0); + expect(shouldAskForApprove).toBe(false); + expect(tablesToRemove!.length).toBe(0); + expect(tablesToTruncate!.length).toBe(0); +}); + +test('recreate table with added column not null and without default', async (t) => { + const turso = createClient({ + url: ':memory:', + }); + + const schema1 = { + users: sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: true }), + name: text('name'), + age: integer('age'), + }), + }; + + const schema2 = { + users: sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: false }), + name: text('name'), + age: integer('age'), + newColumn: text('new_column').notNull(), + }), + }; + + const seedStatements = [ + `INSERT INTO \`users\` ("name", "age") VALUES ('drizzle', 12)`, + `INSERT INTO \`users\` ("name", "age") VALUES ('turso', 12)`, + ]; + + const { + statements, + sqlStatements, + columnsToRemove, + infoToPrint, + shouldAskForApprove, + tablesToRemove, + tablesToTruncate, + } = await diffTestSchemasPushLibSQL( + turso, + schema1, + schema2, + [], + false, + seedStatements, + ); + + 
expect(statements!.length).toBe(1); + expect(statements![0]).toStrictEqual({ + columns: [ + { + autoincrement: false, + name: 'id', + notNull: true, + generated: undefined, + primaryKey: true, + type: 'integer', + }, + { + autoincrement: false, + name: 'name', + notNull: false, + generated: undefined, + primaryKey: false, + type: 'text', + }, + { + autoincrement: false, + name: 'age', + notNull: false, + generated: undefined, + primaryKey: false, + type: 'integer', + }, + { + autoincrement: false, + name: 'new_column', + notNull: true, + generated: undefined, + primaryKey: false, + type: 'text', + }, + ], + compositePKs: [], + referenceData: [], + tableName: 'users', + type: 'recreate_table', + uniqueConstraints: [], + }); + + expect(sqlStatements!.length).toBe(4); + expect(sqlStatements[0]).toBe('DELETE FROM \`users\`;'); + expect(sqlStatements![1]).toBe(`CREATE TABLE \`__new_users\` ( +\t\`id\` integer PRIMARY KEY NOT NULL, +\t\`name\` text, +\t\`age\` integer, +\t\`new_column\` text NOT NULL +);\n`); + expect(sqlStatements![2]).toBe(`DROP TABLE \`users\`;`); + expect(sqlStatements![3]).toBe( + `ALTER TABLE \`__new_users\` RENAME TO \`users\`;`, + ); + + expect(columnsToRemove!.length).toBe(0); + expect(infoToPrint!.length).toBe(1); + expect(infoToPrint![0]).toBe( + `· You're about to add not-null ${ + chalk.underline('new_column') + } column without default value to table, which contains 2 items`, + ); + expect(shouldAskForApprove).toBe(true); + expect(tablesToRemove!.length).toBe(0); + expect(tablesToTruncate!.length).toBe(1); + expect(tablesToTruncate![0]).toBe('users'); +}); + +test('set not null with index', async (t) => { + const turso = createClient({ + url: ':memory:', + }); + + const schema1 = { + users: sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: true }), + name: text('name'), + }, (table) => ({ + someIndex: index('users_name_index').on(table.name), + })), + }; + + const schema2 = { + users: sqliteTable('users', { + id: 
int('id').primaryKey({ autoIncrement: true }), + name: text('name').notNull(), + }, (table) => ({ + someIndex: index('users_name_index').on(table.name), + })), + }; + + const { + statements, + sqlStatements, + columnsToRemove, + infoToPrint, + shouldAskForApprove, + tablesToRemove, + tablesToTruncate, + } = await diffTestSchemasPushLibSQL( + turso, + schema1, + schema2, + [], + ); + + expect(statements!.length).toBe(1); + expect(statements![0]).toStrictEqual({ + columnAutoIncrement: false, + columnDefault: undefined, + columnName: 'name', + columnNotNull: true, + columnOnUpdate: undefined, + columnPk: false, + newDataType: 'text', + schema: '', + tableName: 'users', + type: 'alter_table_alter_column_set_notnull', + }); + + expect(sqlStatements.length).toBe(3); + expect(sqlStatements[0]).toBe( + `DROP INDEX IF EXISTS "users_name_index";`, + ); + expect(sqlStatements[1]).toBe( + `ALTER TABLE \`users\` ALTER COLUMN "name" TO "name" text NOT NULL;`, + ); + expect(sqlStatements[2]).toBe( + `CREATE INDEX \`users_name_index\` ON \`users\` (\`name\`);`, + ); + expect(columnsToRemove!.length).toBe(0); + expect(infoToPrint!.length).toBe(0); + expect(shouldAskForApprove).toBe(false); + expect(tablesToRemove!.length).toBe(0); + expect(tablesToTruncate!.length).toBe(0); +}); + +test('drop not null with two indexes', async (t) => { + const turso = createClient({ + url: ':memory:', + }); + + const schema1 = { + users: sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: true }), + name: text('name').notNull(), + age: int('age').notNull(), + }, (table) => ({ + someUniqeIndex: uniqueIndex('users_name_unique').on(table.name), +
someIndex: index('users_age_index').on(table.age), + })), + }; + + const schema2 = { + users: sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: true }), + name: text('name'), + age: int('age').notNull(), + }, (table) => ({ + someUniqeIndex: uniqueIndex('users_name_unique').on(table.name), + someIndex: index('users_age_index').on(table.age), + })), + }; + + const { + statements, + sqlStatements, + columnsToRemove, + infoToPrint, + shouldAskForApprove, + tablesToRemove, + tablesToTruncate, + } = await diffTestSchemasPushLibSQL( + turso, + schema1, + schema2, + [], + ); + + expect(statements.length).toBe(1); + expect(statements[0]).toStrictEqual({ + type: 'alter_table_alter_column_drop_notnull', + tableName: 'users', + columnName: 'name', + schema: '', + newDataType: 'text', + columnDefault: undefined, + columnOnUpdate: undefined, + columnNotNull: false, + columnAutoIncrement: false, + columnPk: false, + }); + + expect(sqlStatements.length).toBe(5); + expect(sqlStatements[0]).toBe( + `DROP INDEX IF EXISTS "users_name_unique";`, + ); + expect(sqlStatements[1]).toBe( + `DROP INDEX IF EXISTS "users_age_index";`, + ); + expect(sqlStatements[2]).toBe( + `ALTER TABLE \`users\` ALTER COLUMN "name" TO "name" text;`, + ); + expect(sqlStatements[3]).toBe( + `CREATE UNIQUE INDEX \`users_name_unique\` ON \`users\` (\`name\`);`, + ); + expect(sqlStatements[4]).toBe( + `CREATE INDEX \`users_age_index\` ON \`users\` (\`age\`);`, + ); + expect(columnsToRemove!.length).toBe(0); + expect(infoToPrint!.length).toBe(0); + expect(shouldAskForApprove).toBe(false); + expect(tablesToRemove!.length).toBe(0); + expect(tablesToTruncate!.length).toBe(0); +}); diff --git a/drizzle-kit/tests/push/sqlite.test.ts b/drizzle-kit/tests/push/sqlite.test.ts index cf468d3ec..aea5cd379 100644 --- a/drizzle-kit/tests/push/sqlite.test.ts +++ b/drizzle-kit/tests/push/sqlite.test.ts @@ -1,384 +1,630 @@ import Database from 'better-sqlite3'; -import { SQL, sql } from 'drizzle-orm'; -import { blob,
foreignKey, int, integer, numeric, real, sqliteTable, text } from 'drizzle-orm/sqlite-core'; +import chalk from 'chalk'; +import { + blob, + foreignKey, + getTableConfig, + int, + integer, + numeric, + real, + sqliteTable, + text, + uniqueIndex, +} from 'drizzle-orm/sqlite-core'; import { diffTestSchemasPushSqlite } from 'tests/schemaDiffer'; import { expect, test } from 'vitest'; -import { DialectSuite, run } from './common'; - -const sqliteSuite: DialectSuite = { - addBasicIndexes: function(context?: any): Promise { - return {} as any; - }, - changeIndexFields: function(context?: any): Promise { - return {} as any; - }, - dropIndex: function(context?: any): Promise { - return {} as any; - }, - - async allTypes() { - const sqlite = new Database(':memory:'); - - const Users = sqliteTable('users', { - id: integer('id').primaryKey().notNull(), - name: text('name').notNull(), - email: text('email'), - textJson: text('text_json', { mode: 'json' }), - blobJon: blob('blob_json', { mode: 'json' }), - blobBigInt: blob('blob_bigint', { mode: 'bigint' }), - numeric: numeric('numeric'), - createdAt: integer('created_at', { mode: 'timestamp' }), - createdAtMs: integer('created_at_ms', { mode: 'timestamp_ms' }), - real: real('real'), - text: text('text', { length: 255 }), - role: text('role', { enum: ['admin', 'user'] }).default('user'), - isConfirmed: integer('is_confirmed', { - mode: 'boolean', - }), - }); - const schema1 = { - Users, +test('nothing changed in schema', async (t) => { + const client = new Database(':memory:'); + + const users = sqliteTable('users', { + id: integer('id').primaryKey().notNull(), + name: text('name').notNull(), + email: text('email'), + textJson: text('text_json', { mode: 'json' }), + blobJon: blob('blob_json', { mode: 'json' }), + blobBigInt: blob('blob_bigint', { mode: 'bigint' }), + numeric: numeric('numeric'), + createdAt: integer('created_at', { mode: 'timestamp' }), + createdAtMs: integer('created_at_ms', { mode: 'timestamp_ms' }), + real: 
real('real'), + text: text('text', { length: 255 }), + role: text('role', { enum: ['admin', 'user'] }).default('user'), + isConfirmed: integer('is_confirmed', { + mode: 'boolean', + }), + }); - Customers: sqliteTable('customers', { + const schema1 = { + users, + + customers: sqliteTable('customers', { + id: integer('id').primaryKey(), + address: text('address').notNull(), + isConfirmed: integer('is_confirmed', { mode: 'boolean' }), + registrationDate: integer('registration_date', { mode: 'timestamp_ms' }) + .notNull() + .$defaultFn(() => new Date()), + userId: integer('user_id') + .references(() => users.id) + .notNull(), + }), + + posts: sqliteTable('posts', { + id: integer('id').primaryKey(), + content: text('content'), + authorId: integer('author_id'), + }), + }; + + const { + sqlStatements, + statements, + columnsToRemove, + infoToPrint, + shouldAskForApprove, + tablesToRemove, + tablesToTruncate, + } = await diffTestSchemasPushSqlite(client, schema1, schema1, [], false); + expect(sqlStatements.length).toBe(0); + expect(statements.length).toBe(0); + expect(columnsToRemove!.length).toBe(0); + expect(infoToPrint!.length).toBe(0); + expect(shouldAskForApprove).toBe(false); + expect(tablesToRemove!.length).toBe(0); + expect(tablesToTruncate!.length).toBe(0); + expect(shouldAskForApprove).toBe(false); +}); + +test('dropped, added unique index', async (t) => { + const client = new Database(':memory:'); + + const users = sqliteTable('users', { + id: integer('id').primaryKey().notNull(), + name: text('name').notNull(), + email: text('email'), + textJson: text('text_json', { mode: 'json' }), + blobJon: blob('blob_json', { mode: 'json' }), + blobBigInt: blob('blob_bigint', { mode: 'bigint' }), + numeric: numeric('numeric'), + createdAt: integer('created_at', { mode: 'timestamp' }), + createdAtMs: integer('created_at_ms', { mode: 'timestamp_ms' }), + real: real('real'), + text: text('text', { length: 255 }), + role: text('role', { enum: ['admin', 'user'] 
}).default('user'), + isConfirmed: integer('is_confirmed', { + mode: 'boolean', + }), + }); + + const schema1 = { + users, + + customers: sqliteTable( + 'customers', + { id: integer('id').primaryKey(), - address: text('address').notNull(), + address: text('address').notNull().unique(), isConfirmed: integer('is_confirmed', { mode: 'boolean' }), registrationDate: integer('registration_date', { mode: 'timestamp_ms' }) .notNull() .$defaultFn(() => new Date()), - userId: integer('user_id') - .references(() => Users.id) - .notNull(), + userId: integer('user_id').notNull(), + }, + (table) => ({ + uniqueIndex: uniqueIndex('customers_address_unique').on(table.address), }), + ), + + posts: sqliteTable('posts', { + id: integer('id').primaryKey(), + content: text('content'), + authorId: integer('author_id'), + }), + }; + + const schema2 = { + users, - Posts: sqliteTable('posts', { + customers: sqliteTable( + 'customers', + { id: integer('id').primaryKey(), - content: text('content'), - authorId: integer('author_id'), - }), - }; - - const { statements } = await diffTestSchemasPushSqlite( - sqlite, - schema1, - schema1, - [], - false, - ); - expect(statements.length).toBe(0); - }, - indexesToBeNotTriggered: function(context?: any): Promise { - return {} as any; - }, - indexesTestCase1: function(context?: any): Promise { - return {} as any; - }, - async case1(): Promise { - const sqlite = new Database(':memory:'); - - const schema1 = { - users: sqliteTable('users', { - id: text('id').notNull().primaryKey(), - firstName: text('first_name').notNull(), - lastName: text('last_name').notNull(), - username: text('username').notNull().unique(), - email: text('email').notNull().unique(), - password: text('password').notNull(), - avatarUrl: text('avatar_url').notNull(), - postsCount: integer('posts_count').notNull().default(0), - followersCount: integer('followers_count').notNull().default(0), - followingsCount: integer('followings_count').notNull().default(0), - createdAt: 
integer('created_at').notNull(), - }), - }; - - const schema2 = { - users: sqliteTable('users', { - id: text('id').notNull().primaryKey(), - firstName: text('first_name').notNull(), - lastName: text('last_name').notNull(), - username: text('username').notNull().unique(), - email: text('email').notNull().unique(), - password: text('password').notNull(), - avatarUrl: text('avatar_url').notNull(), - followersCount: integer('followers_count').notNull().default(0), - followingsCount: integer('followings_count').notNull().default(0), - createdAt: integer('created_at').notNull(), - }), - }; - - const { statements } = await diffTestSchemasPushSqlite( - sqlite, - schema1, - schema2, - [], - false, - ); - expect(statements.length).toBe(1); - expect(statements[0]).toStrictEqual({ - type: 'alter_table_drop_column', - tableName: 'users', - columnName: 'posts_count', - schema: '', - }); - }, - addNotNull: function(context?: any): Promise { - return {} as any; - }, - addNotNullWithDataNoRollback: function(context?: any): Promise { - return {} as any; - }, - addBasicSequences: function(context?: any): Promise { - return {} as any; - }, - // --- - addGeneratedColumn: async function(context?: any): Promise { - const sqlite = new Database(':memory:'); - - const from = { - users: sqliteTable('users', { - id: int('id'), - id2: int('id2'), - name: text('name'), - }), - }; - const to = { - users: sqliteTable('users', { - id: int('id'), - id2: int('id2'), - name: text('name'), - generatedName: text('gen_name').generatedAlwaysAs( - (): SQL => sql`${to.users.name} || 'hello'`, - { mode: 'stored' }, - ), - }), - }; - - const { statements, sqlStatements } = await diffTestSchemasPushSqlite( - sqlite, - from, - to, - [], - ); - - expect(statements).toStrictEqual([]); - expect(sqlStatements).toStrictEqual([]); - }, - addGeneratedToColumn: async function(context?: any): Promise { - const sqlite = new Database(':memory:'); - - const from = { - users: sqliteTable('users', { - id: int('id'), - id2: 
int('id2'), - name: text('name'), - generatedName: text('gen_name').notNull(), - generatedName1: text('gen_name1'), - }), - }; - const to = { - users: sqliteTable('users', { - id: int('id'), - id2: int('id2'), - name: text('name'), - generatedName: text('gen_name') + address: text('address').notNull(), + isConfirmed: integer('is_confirmed', { mode: 'boolean' }), + registrationDate: integer('registration_date', { mode: 'timestamp_ms' }) .notNull() - .generatedAlwaysAs((): SQL => sql`${to.users.name} || 'hello'`, { - mode: 'stored', - }), - generatedName1: text('gen_name1').generatedAlwaysAs( - (): SQL => sql`${to.users.name} || 'hello'`, - { mode: 'virtual' }, + .$defaultFn(() => new Date()), + userId: integer('user_id').notNull(), + }, + (table) => ({ + uniqueIndex: uniqueIndex('customers_is_confirmed_unique').on( + table.isConfirmed, ), }), - }; + ), + + posts: sqliteTable('posts', { + id: integer('id').primaryKey(), + content: text('content'), + authorId: integer('author_id'), + }), + }; + + const { + sqlStatements, + statements, + columnsToRemove, + infoToPrint, + shouldAskForApprove, + tablesToRemove, + tablesToTruncate, + } = await diffTestSchemasPushSqlite(client, schema1, schema2, [], false); + expect(statements.length).toBe(2); + expect(statements[0]).toStrictEqual({ + type: 'drop_index', + tableName: 'customers', + data: 'customers_address_unique;address;true;', + schema: '', + }); + expect(statements[1]).toStrictEqual({ + type: 'create_index', + tableName: 'customers', + data: 'customers_is_confirmed_unique;is_confirmed;true;', + schema: '', + internal: { + indexes: {}, + }, + }); + + expect(sqlStatements.length).toBe(2); + expect(sqlStatements[0]).toBe( + `DROP INDEX IF EXISTS \`customers_address_unique\`;`, + ); + expect(sqlStatements[1]).toBe( + `CREATE UNIQUE INDEX \`customers_is_confirmed_unique\` ON \`customers\` (\`is_confirmed\`);`, + ); + + expect(columnsToRemove!.length).toBe(0); + expect(infoToPrint!.length).toBe(0); + 
expect(shouldAskForApprove).toBe(false); + expect(tablesToRemove!.length).toBe(0); + expect(tablesToTruncate!.length).toBe(0); +}); + +test('added column not null and without default to table with data', async (t) => { + const client = new Database(':memory:'); + + const schema1 = { + companies: sqliteTable('companies', { + id: integer('id').primaryKey(), + name: text('name').notNull(), + }), + }; + + const schema2 = { + companies: sqliteTable('companies', { + id: integer('id').primaryKey(), + name: text('name').notNull(), + age: integer('age').notNull(), + }), + }; + + const table = getTableConfig(schema1.companies); + const seedStatements = [ + `INSERT INTO \`${table.name}\` ("${schema1.companies.name.name}") VALUES ('drizzle');`, + `INSERT INTO \`${table.name}\` ("${schema1.companies.name.name}") VALUES ('turso');`, + ]; + + const { + statements, + sqlStatements, + columnsToRemove, + infoToPrint, + shouldAskForApprove, + tablesToRemove, + tablesToTruncate, + } = await diffTestSchemasPushSqlite( + client, + schema1, + schema2, + [], + false, + seedStatements, + ); + + expect(statements.length).toBe(1); + expect(statements[0]).toStrictEqual({ + type: 'sqlite_alter_table_add_column', + tableName: 'companies', + column: { + name: 'age', + type: 'integer', + primaryKey: false, + notNull: true, + autoincrement: false, + }, + referenceData: undefined, + }); + expect(sqlStatements.length).toBe(2); + expect(sqlStatements[0]).toBe(`delete from companies;`); + expect(sqlStatements[1]).toBe( + `ALTER TABLE \`companies\` ADD \`age\` integer NOT NULL;`, + ); + + expect(columnsToRemove!.length).toBe(0); + expect(infoToPrint!.length).toBe(1); + expect(infoToPrint![0]).toBe( + `· You're about to add not-null ${ + chalk.underline( + 'age', + ) + } column without default value, which contains 2 items`, + ); + expect(shouldAskForApprove).toBe(true); + expect(tablesToRemove!.length).toBe(0); + expect(tablesToTruncate!.length).toBe(1); + 
expect(tablesToTruncate![0]).toBe('companies'); +}); + +test('added column not null and without default to table without data', async (t) => { + const turso = new Database(':memory:'); + + const schema1 = { + companies: sqliteTable('companies', { + id: integer('id').primaryKey(), + name: text('name').notNull(), + }), + }; + + const schema2 = { + companies: sqliteTable('companies', { + id: integer('id').primaryKey(), + name: text('name').notNull(), + age: integer('age').notNull(), + }), + }; + + const { + sqlStatements, + statements, + columnsToRemove, + infoToPrint, + shouldAskForApprove, + tablesToRemove, + tablesToTruncate, + } = await diffTestSchemasPushSqlite(turso, schema1, schema2, [], false); + + expect(statements.length).toBe(1); + expect(statements[0]).toStrictEqual({ + type: 'sqlite_alter_table_add_column', + tableName: 'companies', + column: { + name: 'age', + type: 'integer', + primaryKey: false, + notNull: true, + autoincrement: false, + }, + referenceData: undefined, + }); + + expect(sqlStatements.length).toBe(1); + expect(sqlStatements[0]).toBe( + `ALTER TABLE \`companies\` ADD \`age\` integer NOT NULL;`, + ); + + expect(infoToPrint!.length).toBe(0); + expect(columnsToRemove!.length).toBe(0); + expect(shouldAskForApprove).toBe(false); + expect(tablesToRemove!.length).toBe(0); + expect(tablesToTruncate!.length).toBe(0); +}); + +test('drop autoincrement. 
drop column with data', async (t) => { + const turso = new Database(':memory:'); + + const schema1 = { + companies: sqliteTable('companies', { + id: integer('id').primaryKey({ autoIncrement: true }), + name: text('name'), + }), + }; + + const schema2 = { + companies: sqliteTable('companies', { + id: integer('id').primaryKey({ autoIncrement: false }), + }), + }; - const { statements, sqlStatements } = await diffTestSchemasPushSqlite( - sqlite, - from, - to, - [], - ); + const table = getTableConfig(schema1.companies); + const seedStatements = [ + `INSERT INTO \`${table.name}\` ("${schema1.companies.id.name}", "${schema1.companies.name.name}") VALUES (1, 'drizzle');`, + `INSERT INTO \`${table.name}\` ("${schema1.companies.id.name}", "${schema1.companies.name.name}") VALUES (2, 'turso');`, + ]; - expect(statements).toStrictEqual([ + const { + sqlStatements, + statements, + columnsToRemove, + infoToPrint, + shouldAskForApprove, + tablesToRemove, + tablesToTruncate, + } = await diffTestSchemasPushSqlite( + turso, + schema1, + schema2, + [], + false, + seedStatements, + ); + + expect(statements.length).toBe(1); + expect(statements[0]).toStrictEqual({ + type: 'recreate_table', + tableName: 'companies', + columns: [ { - columnAutoIncrement: false, - columnDefault: undefined, - columnGenerated: { - as: '("name" || \'hello\')', - type: 'virtual', - }, - columnName: 'gen_name1', - columnNotNull: false, - columnOnUpdate: undefined, - columnPk: false, - newDataType: 'text', - schema: '', - tableName: 'users', - type: 'alter_table_alter_column_set_generated', + name: 'id', + type: 'integer', + autoincrement: false, + notNull: true, + primaryKey: true, + generated: undefined, }, - ]); - expect(sqlStatements).toStrictEqual([ - 'ALTER TABLE `users` DROP COLUMN `gen_name1`;', - 'ALTER TABLE `users` ADD `gen_name1` text GENERATED ALWAYS AS ("name" || \'hello\') VIRTUAL;', - ]); - - for (const st of sqlStatements) { - sqlite.exec(st); - } - }, - dropGeneratedConstraint: async 
function(context?: any): Promise { - const sqlite = new Database(':memory:'); - - const from = { - users: sqliteTable('users', { - id: int('id'), - id2: int('id2'), - name: text('name'), - generatedName: text('gen_name').generatedAlwaysAs( - (): SQL => sql`${to.users.name} || 'hello'`, - { mode: 'stored' }, - ), - generatedName1: text('gen_name1').generatedAlwaysAs( - (): SQL => sql`${to.users.name} || 'hello'`, - { mode: 'virtual' }, - ), - }), - }; - const to = { - users: sqliteTable('users', { - id: int('id'), - id2: int('id2'), - name: text('name'), - generatedName: text('gen_name'), - generatedName1: text('gen_name1'), - }), - }; + ], + compositePKs: [], + referenceData: [], + uniqueConstraints: [], + }); + + expect(sqlStatements.length).toBe(4); + expect(sqlStatements[0]).toBe( + `CREATE TABLE \`__new_companies\` ( +\t\`id\` integer PRIMARY KEY NOT NULL +);\n`, + ); + expect(sqlStatements[1]).toBe( + `INSERT INTO \`__new_companies\`("id") SELECT "id" FROM \`companies\`;`, + ); + expect(sqlStatements[2]).toBe(`DROP TABLE \`companies\`;`); + expect(sqlStatements[3]).toBe( + `ALTER TABLE \`__new_companies\` RENAME TO \`companies\`;`, + ); + + expect(columnsToRemove!.length).toBe(1); + expect(columnsToRemove![0]).toBe('name'); + expect(infoToPrint!.length).toBe(1); + expect(infoToPrint![0]).toBe( + `· You're about to delete ${ + chalk.underline( + 'name', + ) + } column in companies table with 2 items`, + ); + expect(shouldAskForApprove).toBe(true); + expect(tablesToRemove!.length).toBe(0); + expect(tablesToTruncate!.length).toBe(0); +}); + +test('drop autoincrement. 
drop column with data with pragma off', async (t) => { + const client = new Database(':memory:'); - const { statements, sqlStatements } = await diffTestSchemasPushSqlite( - sqlite, - from, - to, - [], - ); + client.exec('PRAGMA foreign_keys=OFF;'); - expect(statements).toStrictEqual([ + const users = sqliteTable('users', { + id: integer('id').primaryKey({ autoIncrement: true }), + }); + const schema1 = { + companies: sqliteTable('companies', { + id: integer('id').primaryKey({ autoIncrement: true }), + name: text('name'), + user_id: integer('user_id').references(() => users.id), + }), + }; + + const schema2 = { + companies: sqliteTable('companies', { + id: integer('id').primaryKey({ autoIncrement: false }), + user_id: integer('user_id').references(() => users.id), + }), + }; + + const table = getTableConfig(schema1.companies); + const seedStatements = [ + `INSERT INTO \`${table.name}\` ("${schema1.companies.id.name}", "${schema1.companies.name.name}") VALUES (1, 'drizzle');`, + `INSERT INTO \`${table.name}\` ("${schema1.companies.id.name}", "${schema1.companies.name.name}") VALUES (2, 'turso');`, + ]; + + const { + sqlStatements, + statements, + columnsToRemove, + infoToPrint, + shouldAskForApprove, + tablesToRemove, + tablesToTruncate, + } = await diffTestSchemasPushSqlite( + client, + schema1, + schema2, + [], + false, + seedStatements, + ); + + expect(statements.length).toBe(1); + expect(statements[0]).toStrictEqual({ + type: 'recreate_table', + tableName: 'companies', + columns: [ { - columnAutoIncrement: false, - columnDefault: undefined, - columnGenerated: undefined, - columnName: 'gen_name', - columnNotNull: false, - columnOnUpdate: undefined, - columnPk: false, - newDataType: 'text', - schema: '', - tableName: 'users', - type: 'alter_table_alter_column_drop_generated', + name: 'id', + type: 'integer', + autoincrement: false, + notNull: true, + primaryKey: true, + generated: undefined, }, { - columnAutoIncrement: false, - columnDefault: undefined, - 
columnGenerated: undefined, - columnName: 'gen_name1', - columnNotNull: false, - columnOnUpdate: undefined, - columnPk: false, - newDataType: 'text', - schema: '', - tableName: 'users', - type: 'alter_table_alter_column_drop_generated', + name: 'user_id', + type: 'integer', + autoincrement: false, + notNull: false, + primaryKey: false, + generated: undefined, }, - ]); - expect(sqlStatements).toStrictEqual([ - 'ALTER TABLE `users` DROP COLUMN `gen_name`;', - 'ALTER TABLE `users` ADD `gen_name` text;', - 'ALTER TABLE `users` DROP COLUMN `gen_name1`;', - 'ALTER TABLE `users` ADD `gen_name1` text;', - ]); - - for (const st of sqlStatements) { - sqlite.exec(st); - } - }, - alterGeneratedConstraint: async function(context?: any): Promise { - const sqlite = new Database(':memory:'); - - const from = { - users: sqliteTable('users', { - id: int('id'), - id2: int('id2'), - name: text('name'), - generatedName: text('gen_name').generatedAlwaysAs( - (): SQL => sql`${to.users.name} || 'hello'`, - { mode: 'stored' }, - ), - generatedName1: text('gen_name1').generatedAlwaysAs( - (): SQL => sql`${to.users.name} || 'hello'`, - { mode: 'virtual' }, - ), - }), - }; - const to = { - users: sqliteTable('users', { - id: int('id'), - id2: int('id2'), - name: text('name'), - generatedName: text('gen_name').generatedAlwaysAs( - (): SQL => sql`${to.users.name}`, - { mode: 'stored' }, - ), - generatedName1: text('gen_name1').generatedAlwaysAs( - (): SQL => sql`${to.users.name}`, - { mode: 'virtual' }, - ), - }), - }; + ], + compositePKs: [], + referenceData: [ + { + columnsFrom: [ + 'user_id', + ], + columnsTo: [ + 'id', + ], + name: '', + onDelete: 'no action', + onUpdate: 'no action', + tableFrom: 'companies', + tableTo: 'users', + }, + ], + uniqueConstraints: [], + }); + + expect(sqlStatements.length).toBe(4); + expect(sqlStatements[0]).toBe( + `CREATE TABLE \`__new_companies\` ( +\t\`id\` integer PRIMARY KEY NOT NULL, +\t\`user_id\` integer, +\tFOREIGN KEY (\`user_id\`) REFERENCES 
\`users\`(\`id\`) ON UPDATE no action ON DELETE no action +);\n`, + ); + expect(sqlStatements[1]).toBe( + `INSERT INTO \`__new_companies\`("id", "user_id") SELECT "id", "user_id" FROM \`companies\`;`, + ); + expect(sqlStatements[2]).toBe(`DROP TABLE \`companies\`;`); + expect(sqlStatements[3]).toBe( + `ALTER TABLE \`__new_companies\` RENAME TO \`companies\`;`, + ); + + expect(columnsToRemove!.length).toBe(1); + expect(infoToPrint!.length).toBe(1); + expect(infoToPrint![0]).toBe( + `· You're about to delete ${ + chalk.underline( + 'name', + ) + } column in companies table with 2 items`, + ); + expect(shouldAskForApprove).toBe(true); + expect(tablesToRemove!.length).toBe(0); + expect(tablesToTruncate!.length).toBe(0); +}); + +test('change autoincrement. other table references current', async (t) => { + const client = new Database(':memory:'); - const { statements, sqlStatements } = await diffTestSchemasPushSqlite( - sqlite, - from, - to, - [], - ); + const companies1 = sqliteTable('companies', { + id: integer('id').primaryKey({ autoIncrement: true }), + }); + const users1 = sqliteTable('users', { + id: integer('id').primaryKey({ autoIncrement: true }), + name: text('name').unique(), + companyId: text('company_id').references(() => companies1.id), + }); + const schema1 = { + companies: companies1, + users: users1, + }; - expect(statements).toStrictEqual([ + const companies2 = sqliteTable('companies', { + id: integer('id').primaryKey({ autoIncrement: false }), + }); + const users2 = sqliteTable('users', { + id: integer('id').primaryKey({ autoIncrement: true }), + name: text('name').unique(), + companyId: text('company_id').references(() => companies1.id), + }); + const schema2 = { + companies: companies2, + users: users2, + }; + + const { name: usersTableName } = getTableConfig(users1); + const { name: companiesTableName } = getTableConfig(companies1); + const seedStatements = [ + `INSERT INTO \`${usersTableName}\` ("${schema1.users.name.name}") VALUES ('drizzle');`, + 
`INSERT INTO \`${usersTableName}\` ("${schema1.users.name.name}") VALUES ('turso');`, + `INSERT INTO \`${companiesTableName}\` ("${schema1.companies.id.name}") VALUES ('1');`, + `INSERT INTO \`${companiesTableName}\` ("${schema1.companies.id.name}") VALUES ('2');`, + ]; + + const { + statements, + sqlStatements, + columnsToRemove, + infoToPrint, + shouldAskForApprove, + tablesToRemove, + tablesToTruncate, + } = await diffTestSchemasPushSqlite( + client, + schema1, + schema2, + [], + false, + seedStatements, + ); + + expect(statements.length).toBe(1); + expect(statements[0]).toStrictEqual({ + type: 'recreate_table', + tableName: 'companies', + columns: [ { - columnAutoIncrement: false, - columnDefault: undefined, - columnGenerated: { - as: '("name")', - type: 'virtual', - }, - columnName: 'gen_name1', - columnNotNull: false, - columnOnUpdate: undefined, - columnPk: false, - newDataType: 'text', - schema: '', - tableName: 'users', - type: 'alter_table_alter_column_alter_generated', + name: 'id', + type: 'integer', + autoincrement: false, + notNull: true, + primaryKey: true, + generated: undefined, }, - ]); - expect(sqlStatements).toStrictEqual([ - 'ALTER TABLE `users` DROP COLUMN `gen_name1`;', - 'ALTER TABLE `users` ADD `gen_name1` text GENERATED ALWAYS AS ("name") VIRTUAL;', - ]); - - for (const st of sqlStatements) { - sqlite.exec(st); - } - }, - createTableWithGeneratedConstraint: function(context?: any): Promise { - return {} as any; - }, -}; - -run(sqliteSuite); + ], + compositePKs: [], + referenceData: [], + uniqueConstraints: [], + }); + + expect(sqlStatements.length).toBe(6); + expect(sqlStatements[0]).toBe(`PRAGMA foreign_keys=OFF;`); + expect(sqlStatements[1]).toBe( + `CREATE TABLE \`__new_companies\` ( +\t\`id\` integer PRIMARY KEY NOT NULL +);\n`, + ); + expect(sqlStatements[2]).toBe( + `INSERT INTO \`__new_companies\`("id") SELECT "id" FROM \`companies\`;`, + ); + expect(sqlStatements[3]).toBe(`DROP TABLE \`companies\`;`); + 
expect(sqlStatements[4]).toBe( + `ALTER TABLE \`__new_companies\` RENAME TO \`companies\`;`, + ); + expect(sqlStatements[5]).toBe(`PRAGMA foreign_keys=ON;`); + + expect(columnsToRemove!.length).toBe(0); + expect(infoToPrint!.length).toBe(0); + expect(shouldAskForApprove).toBe(false); + expect(tablesToRemove!.length).toBe(0); + expect(tablesToTruncate!.length).toBe(0); +}); test('create table with custom name references', async (t) => { - const sqlite = new Database(':memory:'); + const client = new Database(':memory:'); const users = sqliteTable('users', { id: int('id').primaryKey({ autoIncrement: true }), @@ -424,7 +670,7 @@ test('create table with custom name references', async (t) => { }; const { sqlStatements } = await diffTestSchemasPushSqlite( - sqlite, + client, schema1, schema2, [], @@ -432,3 +678,613 @@ test('create table with custom name references', async (t) => { expect(sqlStatements!.length).toBe(0); }); + +test('drop not null, add not null', async (t) => { + const client = new Database(':memory:'); + + const schema1 = { + users: sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: true }), + name: text('name').notNull(), + }), + posts: sqliteTable('posts', { + id: int('id').primaryKey({ autoIncrement: true }), + name: text('name'), + userId: int('user_id'), + }), + }; + + const schema2 = { + users: sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: true }), + name: text('name'), + }), + posts: sqliteTable('posts', { + id: int('id').primaryKey({ autoIncrement: true }), + name: text('name').notNull(), + userId: int('user_id'), + }), + }; + const { + statements, + sqlStatements, + columnsToRemove, + infoToPrint, + shouldAskForApprove, + tablesToRemove, + tablesToTruncate, + } = await diffTestSchemasPushSqlite(client, schema1, schema2, []); + + expect(statements!.length).toBe(2); + expect(statements![0]).toStrictEqual({ + columns: [ + { + autoincrement: true, + generated: undefined, + name: 'id', + notNull: true, + 
primaryKey: true, + type: 'integer', + }, + { + autoincrement: false, + generated: undefined, + name: 'name', + notNull: false, + primaryKey: false, + type: 'text', + }, + ], + compositePKs: [], + referenceData: [], + tableName: 'users', + type: 'recreate_table', + uniqueConstraints: [], + }); + expect(statements![1]).toStrictEqual({ + columns: [ + { + autoincrement: true, + generated: undefined, + name: 'id', + notNull: true, + primaryKey: true, + type: 'integer', + }, + { + autoincrement: false, + generated: undefined, + name: 'name', + notNull: true, + primaryKey: false, + type: 'text', + }, + { + autoincrement: false, + generated: undefined, + name: 'user_id', + notNull: false, + primaryKey: false, + type: 'integer', + }, + ], + compositePKs: [], + referenceData: [], + tableName: 'posts', + type: 'recreate_table', + uniqueConstraints: [], + }); + + expect(sqlStatements.length).toBe(8); + expect(sqlStatements[0]).toBe(`CREATE TABLE \`__new_users\` ( +\t\`id\` integer PRIMARY KEY AUTOINCREMENT NOT NULL, +\t\`name\` text +);\n`); + expect(sqlStatements[1]).toBe( + `INSERT INTO \`__new_users\`("id", "name") SELECT "id", "name" FROM \`users\`;`, + ); + expect(sqlStatements[2]).toBe(`DROP TABLE \`users\`;`); + expect(sqlStatements[3]).toBe( + `ALTER TABLE \`__new_users\` RENAME TO \`users\`;`, + ); + + expect(sqlStatements![4]).toBe(`CREATE TABLE \`__new_posts\` ( +\t\`id\` integer PRIMARY KEY AUTOINCREMENT NOT NULL, +\t\`name\` text NOT NULL, +\t\`user_id\` integer +);\n`); + expect(sqlStatements![5]).toBe( + `INSERT INTO \`__new_posts\`("id", "name", "user_id") SELECT "id", "name", "user_id" FROM \`posts\`;`, + ); + expect(sqlStatements![6]).toBe(`DROP TABLE \`posts\`;`); + expect(sqlStatements![7]).toBe( + `ALTER TABLE \`__new_posts\` RENAME TO \`posts\`;`, + ); + + expect(columnsToRemove!.length).toBe(0); + expect(infoToPrint!.length).toBe(0); + expect(shouldAskForApprove).toBe(false); + expect(tablesToRemove!.length).toBe(0); + 
expect(tablesToTruncate!.length).toBe(0); +}); + +test('rename table and change data type', async (t) => { + const client = new Database(':memory:'); + + const schema1 = { + users: sqliteTable('old_users', { + id: int('id').primaryKey({ autoIncrement: true }), + age: text('age'), + }), + }; + + const schema2 = { + users: sqliteTable('new_users', { + id: int('id').primaryKey({ autoIncrement: true }), + age: integer('age'), + }), + }; + const { + statements, + sqlStatements, + columnsToRemove, + infoToPrint, + shouldAskForApprove, + tablesToRemove, + tablesToTruncate, + } = await diffTestSchemasPushSqlite(client, schema1, schema2, [ + 'public.old_users->public.new_users', + ]); + + expect(statements!.length).toBe(2); + expect(statements![0]).toStrictEqual({ + fromSchema: undefined, + tableNameFrom: 'old_users', + tableNameTo: 'new_users', + toSchema: undefined, + type: 'rename_table', + }); + expect(statements![1]).toStrictEqual({ + columns: [ + { + autoincrement: true, + name: 'id', + notNull: true, + generated: undefined, + primaryKey: true, + type: 'integer', + }, + { + autoincrement: false, + name: 'age', + notNull: false, + generated: undefined, + primaryKey: false, + type: 'integer', + }, + ], + compositePKs: [], + referenceData: [], + tableName: 'new_users', + type: 'recreate_table', + uniqueConstraints: [], + }); + + expect(sqlStatements!.length).toBe(5); + expect(sqlStatements![0]).toBe( + `ALTER TABLE \`old_users\` RENAME TO \`new_users\`;`, + ); + expect(sqlStatements[1]).toBe(`CREATE TABLE \`__new_new_users\` ( +\t\`id\` integer PRIMARY KEY AUTOINCREMENT NOT NULL, +\t\`age\` integer +);\n`); + expect(sqlStatements![2]).toBe( + `INSERT INTO \`__new_new_users\`("id", "age") SELECT "id", "age" FROM \`new_users\`;`, + ); + expect(sqlStatements![3]).toBe(`DROP TABLE \`new_users\`;`); + expect(sqlStatements![4]).toBe( + `ALTER TABLE \`__new_new_users\` RENAME TO \`new_users\`;`, + ); + + expect(columnsToRemove!.length).toBe(0); + 
expect(infoToPrint!.length).toBe(0); + expect(shouldAskForApprove).toBe(false); + expect(tablesToRemove!.length).toBe(0); + expect(tablesToTruncate!.length).toBe(0); +}); + +test('rename column and change data type', async (t) => { + const client = new Database(':memory:'); + + const schema1 = { + users: sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: true }), + name: text('name'), + }), + }; + + const schema2 = { + users: sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: true }), + age: integer('age'), + }), + }; + const { + statements, + sqlStatements, + columnsToRemove, + infoToPrint, + shouldAskForApprove, + tablesToRemove, + tablesToTruncate, + } = await diffTestSchemasPushSqlite(client, schema1, schema2, [ + 'public.users.name->public.users.age', + ]); + + expect(statements!.length).toBe(1); + expect(statements![0]).toStrictEqual({ + columns: [ + { + autoincrement: true, + name: 'id', + notNull: true, + generated: undefined, + primaryKey: true, + type: 'integer', + }, + { + autoincrement: false, + name: 'age', + notNull: false, + generated: undefined, + primaryKey: false, + type: 'integer', + }, + ], + compositePKs: [], + referenceData: [], + tableName: 'users', + type: 'recreate_table', + uniqueConstraints: [], + }); + + expect(sqlStatements!.length).toBe(4); + expect(sqlStatements![0]).toBe(`CREATE TABLE \`__new_users\` ( +\t\`id\` integer PRIMARY KEY AUTOINCREMENT NOT NULL, +\t\`age\` integer +);\n`); + expect(sqlStatements![1]).toBe( + `INSERT INTO \`__new_users\`("id", "age") SELECT "id", "age" FROM \`users\`;`, + ); + expect(sqlStatements![2]).toBe(`DROP TABLE \`users\`;`); + expect(sqlStatements![3]).toBe( + `ALTER TABLE \`__new_users\` RENAME TO \`users\`;`, + ); + + expect(columnsToRemove!.length).toBe(0); + expect(infoToPrint!.length).toBe(0); + expect(shouldAskForApprove).toBe(false); + expect(tablesToRemove!.length).toBe(0); + expect(tablesToTruncate!.length).toBe(0); +}); + +test('recreate table with nested 
references', async (t) => { + const client = new Database(':memory:'); + + let users = sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: true }), + name: text('name'), + age: integer('age'), + }); + let subscriptions = sqliteTable('subscriptions', { + id: int('id').primaryKey({ autoIncrement: true }), + userId: integer('user_id').references(() => users.id), + customerId: text('customer_id'), + }); + const schema1 = { + users: users, + subscriptions: subscriptions, + subscriptionMetadata: sqliteTable('subscriptions_metadata', { + id: int('id').primaryKey({ autoIncrement: true }), + subscriptionId: text('subscription_id').references( + () => subscriptions.id, + ), + }), + }; + + users = sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: false }), + name: text('name'), + age: integer('age'), + }); + const schema2 = { + users: users, + subscriptions: subscriptions, + subscriptionMetadata: sqliteTable('subscriptions_metadata', { + id: int('id').primaryKey({ autoIncrement: true }), + subscriptionId: text('subscription_id').references( + () => subscriptions.id, + ), + }), + }; + + const { + statements, + sqlStatements, + columnsToRemove, + infoToPrint, + shouldAskForApprove, + tablesToRemove, + tablesToTruncate, + } = await diffTestSchemasPushSqlite(client, schema1, schema2, [ + 'public.users.name->public.users.age', + ]); + + expect(statements!.length).toBe(1); + expect(statements![0]).toStrictEqual({ + columns: [ + { + autoincrement: false, + name: 'id', + notNull: true, + generated: undefined, + primaryKey: true, + type: 'integer', + }, + { + autoincrement: false, + name: 'name', + notNull: false, + generated: undefined, + primaryKey: false, + type: 'text', + }, + { + autoincrement: false, + name: 'age', + notNull: false, + generated: undefined, + primaryKey: false, + type: 'integer', + }, + ], + compositePKs: [], + referenceData: [], + tableName: 'users', + type: 'recreate_table', + uniqueConstraints: [], + }); + + 
expect(sqlStatements!.length).toBe(6); + expect(sqlStatements[0]).toBe('PRAGMA foreign_keys=OFF;'); + expect(sqlStatements![1]).toBe(`CREATE TABLE \`__new_users\` ( +\t\`id\` integer PRIMARY KEY NOT NULL, +\t\`name\` text, +\t\`age\` integer +);\n`); + expect(sqlStatements![2]).toBe( + `INSERT INTO \`__new_users\`("id", "name", "age") SELECT "id", "name", "age" FROM \`users\`;`, + ); + expect(sqlStatements![3]).toBe(`DROP TABLE \`users\`;`); + expect(sqlStatements![4]).toBe( + `ALTER TABLE \`__new_users\` RENAME TO \`users\`;`, + ); + expect(sqlStatements[5]).toBe('PRAGMA foreign_keys=ON;'); + + expect(columnsToRemove!.length).toBe(0); + expect(infoToPrint!.length).toBe(0); + expect(shouldAskForApprove).toBe(false); + expect(tablesToRemove!.length).toBe(0); + expect(tablesToTruncate!.length).toBe(0); +}); + +test('recreate table with added column not null and without default with data', async (t) => { + const client = new Database(':memory:'); + + const schema1 = { + users: sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: true }), + name: text('name'), + age: integer('age'), + }), + }; + + const schema2 = { + users: sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: false }), + name: text('name'), + age: integer('age'), + newColumn: text('new_column').notNull(), + }), + }; + + const seedStatements = [ + `INSERT INTO \`users\` ("name", "age") VALUES ('drizzle', 12)`, + `INSERT INTO \`users\` ("name", "age") VALUES ('turso', 12)`, + ]; + + const { + statements, + sqlStatements, + columnsToRemove, + infoToPrint, + shouldAskForApprove, + tablesToRemove, + tablesToTruncate, + } = await diffTestSchemasPushSqlite( + client, + schema1, + schema2, + [], + false, + seedStatements, + ); + + expect(statements!.length).toBe(1); + expect(statements![0]).toStrictEqual({ + columns: [ + { + autoincrement: false, + name: 'id', + notNull: true, + generated: undefined, + primaryKey: true, + type: 'integer', + }, + { + autoincrement: false, + name: 
'name', + notNull: false, + generated: undefined, + primaryKey: false, + type: 'text', + }, + { + autoincrement: false, + name: 'age', + notNull: false, + generated: undefined, + primaryKey: false, + type: 'integer', + }, + { + autoincrement: false, + name: 'new_column', + notNull: true, + generated: undefined, + primaryKey: false, + type: 'text', + }, + ], + compositePKs: [], + referenceData: [], + tableName: 'users', + type: 'recreate_table', + uniqueConstraints: [], + }); + + expect(sqlStatements!.length).toBe(4); + expect(sqlStatements![0]).toBe('DELETE FROM \`users\`;'); + expect(sqlStatements![1]).toBe(`CREATE TABLE \`__new_users\` ( +\t\`id\` integer PRIMARY KEY NOT NULL, +\t\`name\` text, +\t\`age\` integer, +\t\`new_column\` text NOT NULL +);\n`); + expect(sqlStatements![2]).toBe(`DROP TABLE \`users\`;`); + expect(sqlStatements![3]).toBe( + `ALTER TABLE \`__new_users\` RENAME TO \`users\`;`, + ); + + expect(columnsToRemove!.length).toBe(0); + expect(infoToPrint!.length).toBe(1); + expect(infoToPrint![0]).toBe( + `· You're about to add not-null ${ + chalk.underline('new_column') + } column without default value to table, which contains 2 items`, + ); + expect(shouldAskForApprove).toBe(true); + expect(tablesToRemove!.length).toBe(0); + expect(tablesToTruncate!.length).toBe(1); + expect(tablesToTruncate![0]).toBe('users'); +}); + +test('recreate table with added column not null and without default without data', async (t) => { + const client = new Database(':memory:'); + + const schema1 = { + users: sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: true }), + name: text('name'), + age: integer('age'), + }), + }; + + const schema2 = { + users: sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: false }), + name: text('name'), + age: integer('age'), + newColumn: text('new_column').notNull(), + }), + }; + + const { + statements, + sqlStatements, + columnsToRemove, + infoToPrint, + shouldAskForApprove, + tablesToRemove, + 
tablesToTruncate, + } = await diffTestSchemasPushSqlite( + client, + schema1, + schema2, + [], + ); + + expect(statements!.length).toBe(1); + expect(statements![0]).toStrictEqual({ + columns: [ + { + autoincrement: false, + name: 'id', + notNull: true, + generated: undefined, + primaryKey: true, + type: 'integer', + }, + { + autoincrement: false, + name: 'name', + notNull: false, + generated: undefined, + primaryKey: false, + type: 'text', + }, + { + autoincrement: false, + name: 'age', + notNull: false, + generated: undefined, + primaryKey: false, + type: 'integer', + }, + { + autoincrement: false, + name: 'new_column', + notNull: true, + generated: undefined, + primaryKey: false, + type: 'text', + }, + ], + compositePKs: [], + referenceData: [], + tableName: 'users', + type: 'recreate_table', + uniqueConstraints: [], + }); + + expect(sqlStatements!.length).toBe(4); + expect(sqlStatements![0]).toBe(`CREATE TABLE \`__new_users\` ( +\t\`id\` integer PRIMARY KEY NOT NULL, +\t\`name\` text, +\t\`age\` integer, +\t\`new_column\` text NOT NULL +);\n`); + expect(sqlStatements[1]).toBe( + 'INSERT INTO `__new_users`("id", "name", "age", "new_column") SELECT "id", "name", "age", "new_column" FROM `users`;', + ); + expect(sqlStatements![2]).toBe(`DROP TABLE \`users\`;`); + expect(sqlStatements![3]).toBe( + `ALTER TABLE \`__new_users\` RENAME TO \`users\`;`, + ); + + expect(columnsToRemove!.length).toBe(0); + expect(infoToPrint!.length).toBe(0); + expect(shouldAskForApprove).toBe(false); + expect(tablesToRemove!.length).toBe(0); + expect(tablesToTruncate!.length).toBe(0); +}); diff --git a/drizzle-kit/tests/schemaDiffer.ts b/drizzle-kit/tests/schemaDiffer.ts index 58f6a8f64..3223ca5e7 100644 --- a/drizzle-kit/tests/schemaDiffer.ts +++ b/drizzle-kit/tests/schemaDiffer.ts @@ -1,13 +1,14 @@ import { PGlite } from '@electric-sql/pglite'; +import { Client } from '@libsql/client/.'; import { Database } from 'better-sqlite3'; import { is } from 'drizzle-orm'; import { MySqlSchema, 
MySqlTable } from 'drizzle-orm/mysql-core'; import { isPgEnum, isPgSequence, PgEnum, PgSchema, PgSequence, PgTable } from 'drizzle-orm/pg-core'; -import { SingleStoreSchema } from 'drizzle-orm/singlestore-core'; -import { SingleStoreTable } from 'drizzle-orm/singlestore-core'; +import { SingleStoreSchema, SingleStoreTable } from 'drizzle-orm/singlestore-core'; import { SQLiteTable } from 'drizzle-orm/sqlite-core'; import * as fs from 'fs'; import { Connection } from 'mysql2/promise'; +import { libSqlLogSuggestionsAndReturn } from 'src/cli/commands/libSqlPushUtils'; import { columnsResolver, enumsResolver, @@ -37,6 +38,7 @@ import { prepareFromSqliteImports } from 'src/serializer/sqliteImports'; import { sqliteSchema, squashSqliteScheme } from 'src/serializer/sqliteSchema'; import { fromDatabase as fromSqliteDatabase, generateSqliteSnapshot } from 'src/serializer/sqliteSerializer'; import { + applyLibSQLSnapshotsDiff, applyMysqlSnapshotsDiff, applyPgSnapshotsDiff, applySingleStoreSnapshotsDiff, @@ -959,11 +961,18 @@ export const diffTestSchemasPushSqlite = async ( right: SqliteSchema, renamesArr: string[], cli: boolean = false, + seedStatements: string[] = [], ) => { const { sqlStatements } = await applySqliteDiffs(left, 'push'); + for (const st of sqlStatements) { client.exec(st); } + + for (const st of seedStatements) { + client.exec(st); + } + // do introspect into PgSchemaInternal const introspectedSchema = await fromSqliteDatabase( { @@ -977,9 +986,9 @@ export const diffTestSchemasPushSqlite = async ( undefined, ); - const leftTables = Object.values(right).filter((it) => is(it, SQLiteTable)) as SQLiteTable[]; + const rightTables = Object.values(right).filter((it) => is(it, SQLiteTable)) as SQLiteTable[]; - const serialized2 = generateSqliteSnapshot(leftTables); + const serialized2 = generateSqliteSnapshot(rightTables); const { version: v1, dialect: d1, ...rest1 } = introspectedSchema; const { version: v2, dialect: d2, ...rest2 } = serialized2; @@ -1016,7 
+1025,15 @@ export const diffTestSchemasPushSqlite = async ( 'push', ); - const { statementsToExecute } = await logSuggestionsAndReturn( + const { + statementsToExecute, + columnsToRemove, + infoToPrint, + schemasToRemove, + shouldAskForApprove, + tablesToRemove, + tablesToTruncate, + } = await logSuggestionsAndReturn( { query: async (sql: string, params: any[] = []) => { return client.prepare(sql).bind(params).all() as T[]; @@ -1031,7 +1048,16 @@ export const diffTestSchemasPushSqlite = async ( _meta!, ); - return { sqlStatements: statementsToExecute, statements }; + return { + sqlStatements: statementsToExecute, + statements, + columnsToRemove, + infoToPrint, + schemasToRemove, + shouldAskForApprove, + tablesToRemove, + tablesToTruncate, + }; } else { const { sqlStatements, statements } = await applySqliteSnapshotsDiff( sn1, @@ -1046,6 +1072,122 @@ } }; +export async function diffTestSchemasPushLibSQL( + client: Client, + left: SqliteSchema, + right: SqliteSchema, + renamesArr: string[], + cli: boolean = false, + seedStatements: string[] = [], +) { + const { sqlStatements } = await applyLibSQLDiffs(left, 'push'); + + for (const st of sqlStatements) { + await client.execute(st); + } + + for (const st of seedStatements) { + await client.execute(st); + } + + const introspectedSchema = await fromSqliteDatabase( + { + query: async (sql: string, params?: any[]) => { + const res = await client.execute({ sql, args: params || [] }); + return res.rows as T[]; + }, + run: async (query: string) => { + await client.execute(query); + }, + }, + undefined, + ); + + const rightTables = Object.values(right).filter((it) => is(it, SQLiteTable)) as SQLiteTable[]; + + const serialized2 = generateSqliteSnapshot(rightTables); + + const { version: v1, dialect: d1, ...rest1 } = introspectedSchema; + const { version: v2, dialect: d2, ...rest2 } = serialized2; + + const sch1 = { + version: '6', + dialect: 'sqlite', + id: '0', + prevId: '0', + 
...rest1, + } as const; + + const sch2 = { + version: '6', + dialect: 'sqlite', + id: '0', + prevId: '0', + ...rest2, + } as const; + + const sn1 = squashSqliteScheme(sch1, 'push'); + const sn2 = squashSqliteScheme(sch2, 'push'); + + const renames = new Set(renamesArr); + + if (!cli) { + const { sqlStatements, statements, _meta } = await applyLibSQLSnapshotsDiff( + sn1, + sn2, + testTablesResolver(renames), + testColumnsResolver(renames), + sch1, + sch2, + 'push', + ); + + const { + statementsToExecute, + columnsToRemove, + infoToPrint, + shouldAskForApprove, + tablesToRemove, + tablesToTruncate, + } = await libSqlLogSuggestionsAndReturn( + { + query: async (sql: string, params?: any[]) => { + const res = await client.execute({ sql, args: params || [] }); + return res.rows as T[]; + }, + run: async (query: string) => { + await client.execute(query); + }, + }, + statements, + sn1, + sn2, + _meta!, + ); + + return { + sqlStatements: statementsToExecute, + statements, + columnsToRemove, + infoToPrint, + shouldAskForApprove, + tablesToRemove, + tablesToTruncate, + }; + } else { + const { sqlStatements, statements } = await applyLibSQLSnapshotsDiff( + sn1, + sn2, + tablesResolver, + columnsResolver, + sch1, + sch2, + 'push', + ); + return { sqlStatements, statements }; + } +} + export const applySqliteDiffs = async ( sn: SqliteSchema, action?: 'push' | undefined, @@ -1094,6 +1236,54 @@ export const applySqliteDiffs = async ( return { sqlStatements, statements }; }; +export const applyLibSQLDiffs = async ( + sn: SqliteSchema, + action?: 'push' | undefined, +) => { + const dryRun = { + version: '6', + dialect: 'sqlite', + id: '0', + prevId: '0', + tables: {}, + enums: {}, + schemas: {}, + _meta: { + schemas: {}, + tables: {}, + columns: {}, + }, + } as const; + + const tables = Object.values(sn).filter((it) => is(it, SQLiteTable)) as SQLiteTable[]; + + const serialized1 = generateSqliteSnapshot(tables); + + const { version: v1, dialect: d1, ...rest1 } = serialized1; + + 
const sch1 = { + version: '6', + dialect: 'sqlite', + id: '0', + prevId: '0', + ...rest1, + } as const; + + const sn1 = squashSqliteScheme(sch1, action); + + const { sqlStatements, statements } = await applyLibSQLSnapshotsDiff( + dryRun, + sn1, + testTablesResolver(new Set()), + testColumnsResolver(new Set()), + dryRun, + sch1, + action, + ); + + return { sqlStatements, statements }; +}; + export const diffTestSchemasSqlite = async ( left: SqliteSchema, right: SqliteSchema, @@ -1154,6 +1344,66 @@ export const diffTestSchemasSqlite = async ( return { sqlStatements, statements }; }; +export const diffTestSchemasLibSQL = async ( + left: SqliteSchema, + right: SqliteSchema, + renamesArr: string[], + cli: boolean = false, +) => { + const leftTables = Object.values(left).filter((it) => is(it, SQLiteTable)) as SQLiteTable[]; + + const rightTables = Object.values(right).filter((it) => is(it, SQLiteTable)) as SQLiteTable[]; + + const serialized1 = generateSqliteSnapshot(leftTables); + const serialized2 = generateSqliteSnapshot(rightTables); + + const { version: v1, dialect: d1, ...rest1 } = serialized1; + const { version: v2, dialect: d2, ...rest2 } = serialized2; + + const sch1 = { + version: '6', + dialect: 'sqlite', + id: '0', + prevId: '0', + ...rest1, + } as const; + + const sch2 = { + version: '6', + dialect: 'sqlite', + id: '0', + prevId: '0', + ...rest2, + } as const; + + const sn1 = squashSqliteScheme(sch1); + const sn2 = squashSqliteScheme(sch2); + + const renames = new Set(renamesArr); + + if (!cli) { + const { sqlStatements, statements } = await applyLibSQLSnapshotsDiff( + sn1, + sn2, + testTablesResolver(renames), + testColumnsResolver(renames), + sch1, + sch2, + ); + return { sqlStatements, statements }; + } + + const { sqlStatements, statements } = await applyLibSQLSnapshotsDiff( + sn1, + sn2, + tablesResolver, + columnsResolver, + sch1, + sch2, + ); + return { sqlStatements, statements }; +}; + // --- Introspect to file helpers --- export const 
introspectPgToFile = async ( diff --git a/drizzle-kit/tests/sqlite-columns.test.ts b/drizzle-kit/tests/sqlite-columns.test.ts index 8a258072a..04dbb940c 100644 --- a/drizzle-kit/tests/sqlite-columns.test.ts +++ b/drizzle-kit/tests/sqlite-columns.test.ts @@ -8,6 +8,7 @@ import { sqliteTable, text, } from 'drizzle-orm/sqlite-core'; +import { JsonCreateIndexStatement, JsonRecreateTableStatement } from 'src/jsonStatements'; import { expect, test } from 'vitest'; import { diffTestSchemasSqlite } from './schemaDiffer'; @@ -223,7 +224,7 @@ test('add columns #5', async (t) => { const { statements } = await diffTestSchemasSqlite(schema1, schema2, []); // TODO: Fix here - expect(statements.length).toBe(2); + expect(statements.length).toBe(1); expect(statements[0]).toStrictEqual({ type: 'sqlite_alter_table_add_column', tableName: 'users', @@ -332,12 +333,38 @@ test('add foreign key #1', async (t) => { const { statements } = await diffTestSchemasSqlite(schema1, schema2, []); expect(statements.length).toBe(1); - expect(statements[0]).toStrictEqual({ - type: 'create_reference', - tableName: 'users', - schema: '', - data: 'users_report_to_users_id_fk;users;report_to;users;id;no action;no action', - }); + expect(statements[0]).toStrictEqual( + { + type: 'recreate_table', + columns: [{ + autoincrement: true, + generated: undefined, + name: 'id', + notNull: true, + primaryKey: true, + type: 'integer', + }, { + autoincrement: false, + generated: undefined, + name: 'report_to', + notNull: false, + primaryKey: false, + type: 'integer', + }], + compositePKs: [], + referenceData: [{ + columnsFrom: ['report_to'], + columnsTo: ['id'], + name: 'users_report_to_users_id_fk', + tableFrom: 'users', + tableTo: 'users', + onDelete: 'no action', + onUpdate: 'no action', + }], + tableName: 'users', + uniqueConstraints: [], + } as JsonRecreateTableStatement, + ); }); test('add foreign key #2', async (t) => { @@ -371,11 +398,35 @@ test('add foreign key #2', async (t) => { 
expect(statements.length).toBe(1); expect(statements[0]).toStrictEqual({ - type: 'create_reference', + type: 'recreate_table', + columns: [{ + autoincrement: true, + generated: undefined, + name: 'id', + notNull: true, + primaryKey: true, + type: 'integer', + }, { + autoincrement: false, + generated: undefined, + name: 'report_to', + notNull: false, + primaryKey: false, + type: 'integer', + }], + compositePKs: [], + referenceData: [{ + columnsFrom: ['report_to'], + columnsTo: ['id'], + name: 'reportee_fk', + tableFrom: 'users', + tableTo: 'users', + onDelete: 'no action', + onUpdate: 'no action', + }], tableName: 'users', - schema: '', - data: 'reportee_fk;users;report_to;users;id;no action;no action', - }); + uniqueConstraints: [], + } as JsonRecreateTableStatement); }); test('alter column change name #1', async (t) => { @@ -513,9 +564,26 @@ test('alter table add composite pk', async (t) => { expect(statements.length).toBe(1); expect(statements[0]).toStrictEqual({ - type: 'create_composite_pk', + type: 'recreate_table', + columns: [{ + autoincrement: false, + generated: undefined, + name: 'id1', + notNull: false, + primaryKey: false, + type: 'integer', + }, { + autoincrement: false, + generated: undefined, + name: 'id2', + notNull: false, + primaryKey: false, + type: 'integer', + }], + compositePKs: [['id1', 'id2']], + referenceData: [], tableName: 'table', - data: 'id1,id2', + uniqueConstraints: [], }); }); @@ -540,16 +608,19 @@ test('alter column drop not null', async (t) => { expect(statements.length).toBe(1); expect(statements[0]).toStrictEqual({ - type: 'alter_table_alter_column_drop_notnull', + type: 'recreate_table', + columns: [{ + autoincrement: false, + generated: undefined, + name: 'name', + notNull: false, + primaryKey: false, + type: 'text', + }], + compositePKs: [], + referenceData: [], tableName: 'table', - columnName: 'name', - schema: '', - newDataType: 'text', - columnDefault: undefined, - columnOnUpdate: undefined, - columnNotNull: false, - 
columnAutoIncrement: false, - columnPk: false, + uniqueConstraints: [], }); }); @@ -574,16 +645,19 @@ test('alter column add not null', async (t) => { expect(statements.length).toBe(1); expect(statements[0]).toStrictEqual({ - type: 'alter_table_alter_column_set_notnull', + type: 'recreate_table', + columns: [{ + autoincrement: false, + generated: undefined, + name: 'name', + notNull: true, + primaryKey: false, + type: 'text', + }], + compositePKs: [], + referenceData: [], tableName: 'table', - columnName: 'name', - schema: '', - newDataType: 'text', - columnDefault: undefined, - columnOnUpdate: undefined, - columnNotNull: true, - columnAutoIncrement: false, - columnPk: false, + uniqueConstraints: [], }); }); @@ -608,16 +682,20 @@ test('alter column add default', async (t) => { expect(statements.length).toBe(1); expect(statements[0]).toStrictEqual({ - type: 'alter_table_alter_column_set_default', + type: 'recreate_table', + columns: [{ + autoincrement: false, + generated: undefined, + name: 'name', + notNull: false, + primaryKey: false, + type: 'text', + default: "'dan'", + }], + compositePKs: [], + referenceData: [], tableName: 'table', - columnName: 'name', - schema: '', - newDataType: 'text', - columnNotNull: false, - columnOnUpdate: undefined, - columnAutoIncrement: false, - newDefaultValue: "'dan'", - columnPk: false, + uniqueConstraints: [], }); }); @@ -642,16 +720,19 @@ test('alter column drop default', async (t) => { expect(statements.length).toBe(1); expect(statements[0]).toStrictEqual({ - type: 'alter_table_alter_column_drop_default', + type: 'recreate_table', + columns: [{ + autoincrement: false, + generated: undefined, + name: 'name', + notNull: false, + primaryKey: false, + type: 'text', + }], + compositePKs: [], + referenceData: [], tableName: 'table', - columnName: 'name', - schema: '', - newDataType: 'text', - columnNotNull: false, - columnOnUpdate: undefined, - columnDefault: undefined, - columnAutoIncrement: false, - columnPk: false, + 
uniqueConstraints: [], }); }); @@ -674,32 +755,84 @@ test('alter column add default not null', async (t) => { [], ); - expect(statements.length).toBe(2); + expect(statements.length).toBe(1); expect(statements[0]).toStrictEqual({ - columnAutoIncrement: false, - columnName: 'name', - columnNotNull: true, - columnOnUpdate: undefined, - columnPk: false, - newDataType: 'text', - newDefaultValue: "'dan'", - schema: '', + type: 'recreate_table', + columns: [{ + autoincrement: false, + generated: undefined, + name: 'name', + notNull: true, + primaryKey: false, + type: 'text', + default: "'dan'", + }], + compositePKs: [], + referenceData: [], tableName: 'table', - type: 'alter_table_alter_column_set_default', + uniqueConstraints: [], }); +}); +test('alter column add default not null with indexes', async (t) => { + const from = { + users: sqliteTable('table', { + name: text('name'), + }, (table) => ({ + someIndex: index('index_name').on(table.name), + })), + }; + + const to = { + users: sqliteTable('table', { + name: text('name').notNull().default('dan'), + }, (table) => ({ + someIndex: index('index_name').on(table.name), + })), + }; + + const { statements, sqlStatements } = await diffTestSchemasSqlite( + from, + to, + [], + ); + + expect(statements.length).toBe(2); expect(statements[0]).toStrictEqual({ - columnAutoIncrement: false, - columnName: 'name', - columnNotNull: true, - columnOnUpdate: undefined, - columnPk: false, - newDataType: 'text', - newDefaultValue: "'dan'", + type: 'recreate_table', + columns: [{ + autoincrement: false, + generated: undefined, + name: 'name', + notNull: true, + primaryKey: false, + type: 'text', + default: "'dan'", + }], + compositePKs: [], + referenceData: [], + tableName: 'table', + uniqueConstraints: [], + }); + expect(statements[1]).toStrictEqual({ + data: 'index_name;name;false;', schema: '', tableName: 'table', - type: 'alter_table_alter_column_set_default', + type: 'create_index', + internal: undefined, }); + 
expect(sqlStatements.length).toBe(7); + expect(sqlStatements[0]).toBe(`PRAGMA foreign_keys=OFF;`); + expect(sqlStatements[1]).toBe(`CREATE TABLE \`__new_table\` ( +\t\`name\` text DEFAULT 'dan' NOT NULL +);\n`); + expect(sqlStatements[2]).toBe( + `INSERT INTO \`__new_table\`("name") SELECT "name" FROM \`table\`;`, + ); + expect(sqlStatements[3]).toBe(`DROP TABLE \`table\`;`); + expect(sqlStatements[4]).toBe(`ALTER TABLE \`__new_table\` RENAME TO \`table\`;`); + expect(sqlStatements[5]).toBe(`PRAGMA foreign_keys=ON;`); + expect(sqlStatements[6]).toBe(`CREATE INDEX \`index_name\` ON \`table\` (\`name\`);`); }); test('alter column drop default not null', async (t) => { @@ -721,30 +854,162 @@ test('alter column drop default not null', async (t) => { [], ); - expect(statements.length).toBe(2); + expect(statements.length).toBe(1); expect(statements[0]).toStrictEqual({ - columnAutoIncrement: false, - columnDefault: undefined, - columnName: 'name', - columnNotNull: false, - columnOnUpdate: undefined, - columnPk: false, - newDataType: 'text', - schema: '', + type: 'recreate_table', + columns: [{ + autoincrement: false, + generated: undefined, + name: 'name', + notNull: false, + primaryKey: false, + type: 'text', + }], + compositePKs: [], + referenceData: [], tableName: 'table', - type: 'alter_table_alter_column_drop_default', + uniqueConstraints: [], }); + expect(sqlStatements.length).toBe(6); + expect(sqlStatements[0]).toBe(`PRAGMA foreign_keys=OFF;`); + expect(sqlStatements[1]).toBe(`CREATE TABLE \`__new_table\` ( +\t\`name\` text +);\n`); + expect(sqlStatements[2]).toBe( + `INSERT INTO \`__new_table\`("name") SELECT "name" FROM \`table\`;`, + ); + expect(sqlStatements[3]).toBe(`DROP TABLE \`table\`;`); + expect(sqlStatements[4]).toBe(`ALTER TABLE \`__new_table\` RENAME TO \`table\`;`); + expect(sqlStatements[5]).toBe(`PRAGMA foreign_keys=ON;`); +}); +test('alter column drop generated', async (t) => { + const from = { + users: sqliteTable('table', { + id: 
int('id').primaryKey().notNull(), + name: text('name').generatedAlwaysAs('drizzle is the best').notNull(), + }), + }; + + const to = { + users: sqliteTable('table', { + id: int('id').primaryKey().notNull(), + name: text('name').notNull(), + }), + }; + + const { statements, sqlStatements } = await diffTestSchemasSqlite( + from, + to, + [], + ); + + expect(statements.length).toBe(1); expect(statements[0]).toStrictEqual({ columnAutoIncrement: false, columnDefault: undefined, + columnGenerated: undefined, columnName: 'name', - columnNotNull: false, + columnNotNull: true, columnOnUpdate: undefined, columnPk: false, newDataType: 'text', schema: '', tableName: 'table', - type: 'alter_table_alter_column_drop_default', + type: 'alter_table_alter_column_drop_generated', }); + + expect(sqlStatements.length).toBe(2); + expect(sqlStatements[0]).toBe(`ALTER TABLE \`table\` DROP COLUMN \`name\`;`); + expect(sqlStatements[1]).toBe(`ALTER TABLE \`table\` ADD \`name\` text NOT NULL;`); +}); + +test('recreate table with nested references', async (t) => { + let users = sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: true }), + name: text('name'), + age: integer('age'), + }); + let subscriptions = sqliteTable('subscriptions', { + id: int('id').primaryKey({ autoIncrement: true }), + userId: integer('user_id').references(() => users.id), + customerId: text('customer_id'), + }); + const schema1 = { + users: users, + subscriptions: subscriptions, + subscriptionMetadata: sqliteTable('subscriptions_metadata', { + id: int('id').primaryKey({ autoIncrement: true }), + subscriptionId: text('subscription_id').references(() => subscriptions.id), + }), + }; + + users = sqliteTable('users', { + id: int('id').primaryKey({ autoIncrement: false }), + name: text('name'), + age: integer('age'), + }); + const schema2 = { + users: users, + subscriptions: subscriptions, + subscriptionMetadata: sqliteTable('subscriptions_metadata', { + id: int('id').primaryKey({ autoIncrement: true }), + 
subscriptionId: text('subscription_id').references(() => subscriptions.id), + }), + }; + + const { statements, sqlStatements } = await diffTestSchemasSqlite( + schema1, + schema2, + [], + ); + + expect(statements.length).toBe(1); + expect(statements[0]).toStrictEqual({ + columns: [ + { + autoincrement: false, + generated: undefined, + name: 'id', + notNull: true, + primaryKey: true, + type: 'integer', + }, + { + autoincrement: false, + generated: undefined, + name: 'name', + notNull: false, + primaryKey: false, + type: 'text', + }, + { + autoincrement: false, + generated: undefined, + name: 'age', + notNull: false, + primaryKey: false, + type: 'integer', + }, + ], + compositePKs: [], + referenceData: [], + tableName: 'users', + type: 'recreate_table', + uniqueConstraints: [], + }); + + expect(sqlStatements.length).toBe(6); + expect(sqlStatements[0]).toBe(`PRAGMA foreign_keys=OFF;`); + expect(sqlStatements[1]).toBe(`CREATE TABLE \`__new_users\` ( +\t\`id\` integer PRIMARY KEY NOT NULL, +\t\`name\` text, +\t\`age\` integer +);\n`); + expect(sqlStatements[2]).toBe( + `INSERT INTO \`__new_users\`("id", "name", "age") SELECT "id", "name", "age" FROM \`users\`;`, + ); + expect(sqlStatements[3]).toBe(`DROP TABLE \`users\`;`); + expect(sqlStatements[4]).toBe(`ALTER TABLE \`__new_users\` RENAME TO \`users\`;`); + expect(sqlStatements[5]).toBe(`PRAGMA foreign_keys=ON;`); }); diff --git a/drizzle-kit/tests/sqlite-tables.test.ts b/drizzle-kit/tests/sqlite-tables.test.ts index d7781f150..aa44908ba 100644 --- a/drizzle-kit/tests/sqlite-tables.test.ts +++ b/drizzle-kit/tests/sqlite-tables.test.ts @@ -162,6 +162,13 @@ test('add table #7', async () => { expect(statements.length).toBe(2); expect(statements[0]).toStrictEqual({ + type: 'rename_table', + tableNameFrom: 'users1', + tableNameTo: 'users2', + fromSchema: undefined, + toSchema: undefined, + }); + expect(statements[1]).toStrictEqual({ type: 'sqlite_create_table', tableName: 'users', columns: [], @@ -169,13 +176,6 @@ 
test('add table #7', async () => { uniqueConstraints: [], referenceData: [], }); - expect(statements[1]).toStrictEqual({ - type: 'rename_table', - tableNameFrom: 'users1', - tableNameTo: 'users2', - fromSchema: undefined, - toSchema: undefined, - }); }); test('add table #8', async () => { diff --git a/drizzle-kit/tests/statements-combiner/libsql-statements-combiner.test.ts b/drizzle-kit/tests/statements-combiner/libsql-statements-combiner.test.ts new file mode 100644 index 000000000..47447decd --- /dev/null +++ b/drizzle-kit/tests/statements-combiner/libsql-statements-combiner.test.ts @@ -0,0 +1,1749 @@ +import { JsonAddColumnStatement, JsonSqliteAddColumnStatement, JsonStatement } from 'src/jsonStatements'; +import { SQLiteSchemaSquashed } from 'src/serializer/sqliteSchema'; +import { SQLiteAlterTableAddColumnConvertor } from 'src/sqlgenerator'; +import { libSQLCombineStatements } from 'src/statementCombiner'; +import { expect, test } from 'vitest'; + +/** + * ! before: + * + * user: { + * id INT; + * first_name INT; + * iq INT; + * PRIMARY KEY (id, iq) + * INDEXES: { + * UNIQUE id; + * } + * } + * + * ! 
after: + * + * new_user: { + * id INT; + * first_name INT; + * iq INT; + * PRIMARY KEY (id, iq) + * INDEXES: {} + * } + * + * rename table and drop unique index + * expect to get "rename_table" statement and then "recreate_table" + */ +test(`rename table and drop index`, async (t) => { + const statements: JsonStatement[] = [ + { + type: 'rename_table', + fromSchema: '', + toSchema: '', + tableNameFrom: 'user', + tableNameTo: 'new_user', + }, + { + type: 'drop_index', + tableName: 'new_user', + data: 'user_first_name_unique;first_name;true;', + schema: '', + }, + ]; + const json1: SQLiteSchemaSquashed = { + version: '6', + dialect: 'sqlite', + tables: { + user: { + name: 'user', + columns: { + id: { + name: 'id', + type: 'int', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + first_name: { + name: 'first_name', + type: 'int', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + iq: { + name: 'iq', + type: 'int', + primaryKey: false, + notNull: true, + autoincrement: false, + }, + }, + indexes: { + user_first_name_unique: 'user_first_name_unique;first_name;true;', + }, + foreignKeys: {}, + compositePrimaryKeys: { + user_id_iq_pk: 'id,iq', + }, + uniqueConstraints: {}, + }, + }, + enums: {}, + }; + const json2: SQLiteSchemaSquashed = { + version: '6', + dialect: 'sqlite', + tables: { + new_user: { + name: 'new_user', + columns: { + id: { + name: 'id', + type: 'int', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + first_name: { + name: 'first_name', + type: 'int', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + iq: { + name: 'iq', + type: 'int', + primaryKey: false, + notNull: true, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: { + new_user_id_iq_pk: 'id,iq', + }, + uniqueConstraints: {}, + }, + }, + enums: {}, + }; + + const newJsonStatements = [ + { + type: 'rename_table', + fromSchema: '', + toSchema: '', + tableNameFrom: 'user', + 
tableNameTo: 'new_user', + }, + { + type: 'drop_index', + tableName: 'new_user', + data: 'user_first_name_unique;first_name;true;', + schema: '', + }, + ]; + expect(libSQLCombineStatements(statements, json2)).toStrictEqual( + newJsonStatements, + ); +}); + +/** + * ! before: + * + * autoincrement1: { + * id INT PRIMARY KEY; + * } + * + * autoincrement2: { + * id INT PRIMARY KEY AUTOINCREMENT; + * } + * + * dropNotNull: { + * id INT NOT NULL; + * } + * + * ! after: + * + * autoincrement1: { + * id INT PRIMARY KEY AUTOINCREMENT; + * } + * + * autoincrement2: { + * id INT PRI { + const statements: JsonStatement[] = [ + { + type: 'alter_table_alter_column_set_autoincrement', + tableName: 'autoincrement1', + columnName: 'id', + schema: '', + newDataType: 'int', + columnDefault: undefined, + columnOnUpdate: undefined, + columnNotNull: true, + columnAutoIncrement: true, + columnPk: true, + } as unknown as JsonStatement, + { + type: 'alter_table_alter_column_drop_autoincrement', + tableName: 'autoincrement2', + columnName: 'id', + schema: '', + newDataType: 'int', + columnDefault: undefined, + columnOnUpdate: undefined, + columnNotNull: true, + columnAutoIncrement: false, + columnPk: true, + } as unknown as JsonStatement, + { + type: 'alter_table_alter_column_drop_notnull', + tableName: 'dropNotNull', + columnName: 'id', + schema: '', + newDataType: 'int', + columnDefault: undefined, + columnOnUpdate: undefined, + columnNotNull: false, + columnAutoIncrement: false, + columnPk: false, + } as unknown as JsonStatement, + ]; + const json1: SQLiteSchemaSquashed = { + version: '6', + dialect: 'sqlite', + tables: { + autoincrement1: { + name: 'autoincrement1', + columns: { + id: { + name: 'id', + type: 'int', + primaryKey: true, + notNull: true, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + autoincrement2: { + name: 'autoincrement2', + columns: { + id: { + name: 'id', + type: 'int', + primaryKey: 
true, + notNull: false, + autoincrement: true, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + dropNotNull: { + name: 'dropNotNull', + columns: { + id: { + name: 'id', + type: 'int', + primaryKey: false, + notNull: true, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + }, + enums: {}, + }; + const json2: SQLiteSchemaSquashed = { + version: '6', + dialect: 'sqlite', + tables: { + autoincrement1: { + name: 'autoincrement1', + columns: { + id: { + name: 'id', + type: 'int', + primaryKey: true, + notNull: true, + autoincrement: true, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + autoincrement2: { + name: 'autoincrement2', + columns: { + id: { + name: 'id', + type: 'int', + primaryKey: true, + notNull: true, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + dropNotNull: { + name: 'dropNotNull', + columns: { + id: { + name: 'id', + type: 'int', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + }, + enums: {}, + }; + + const newJsonStatements = [ + { + type: 'recreate_table', + tableName: 'autoincrement1', + columns: [ + { + name: 'id', + type: 'int', + primaryKey: true, + notNull: true, + autoincrement: true, + }, + ], + compositePKs: [], + referenceData: [], + uniqueConstraints: [], + }, + { + type: 'recreate_table', + tableName: 'autoincrement2', + columns: [ + { + name: 'id', + type: 'int', + primaryKey: true, + notNull: true, + autoincrement: false, + }, + ], + compositePKs: [], + referenceData: [], + uniqueConstraints: [], + }, + { + type: 'alter_table_alter_column_drop_notnull', + tableName: 'dropNotNull', + columnName: 'id', + schema: '', + newDataType: 'int', + columnDefault: 
undefined, + columnOnUpdate: undefined, + columnNotNull: false, + columnAutoIncrement: false, + columnPk: false, + }, + ]; + expect(libSQLCombineStatements(statements, json2)).toStrictEqual( + newJsonStatements, + ); +}); + +/** + * ! before: + * + * pk1: { + * id INT; + * } + * + * pk2: { + * id INT PRIMARY KEY; + * } + * + * ref_table: { + * id INT; + * } + * + * create_reference: { + * id INT; + * } + * + * ! after: + * + * pk1: { + * id INT PRIMARY KEY; + * } + * + * pk2: { + * id INT; + * } + * + * ref_table: { + * id INT; + * } + * + * create_reference: { + * id INT -> ref_table INT; + * } + * + * drop primary key for pk2 + * set primary key for pk1 + * "create_reference" reference on "ref_table" + * + * expect to: + * - "recreate_table" statement for pk1 + * - "recreate_table" statement for pk2 + * - "create_reference" statement for create_reference + */ +test(`drop and set primary key. create reference`, async (t) => { + const statements: JsonStatement[] = [ + { + type: 'alter_table_alter_column_set_pk', + tableName: 'pk1', + schema: '', + columnName: 'id', + }, + { + type: 'alter_table_alter_column_set_notnull', + tableName: 'pk1', + columnName: 'id', + schema: '', + newDataType: 'int', + columnDefault: undefined, + columnOnUpdate: undefined, + columnNotNull: true, + columnAutoIncrement: false, + columnPk: true, + } as unknown as JsonStatement, + { + type: 'alter_table_alter_column_drop_pk', + tableName: 'pk2', + columnName: 'id', + schema: '', + }, + { + type: 'alter_table_alter_column_drop_notnull', + tableName: 'pk2', + columnName: 'id', + schema: '', + newDataType: 'int', + columnDefault: undefined, + columnOnUpdate: undefined, + columnNotNull: false, + columnAutoIncrement: false, + columnPk: false, + } as unknown as JsonStatement, + { + type: 'create_reference', + tableName: 'create_reference', + data: 'create_reference_id_ref_table_id_fk;create_reference;id;ref_table;id;no action;no action', + schema: '', + columnNotNull: false, + columnDefault: 
undefined, + columnType: 'int', + }, + ]; + const json1: SQLiteSchemaSquashed = { + version: '6', + dialect: 'sqlite', + tables: { + create_reference: { + name: 'create_reference', + columns: { + id: { + name: 'id', + type: 'int', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + pk1: { + name: 'pk1', + columns: { + id: { + name: 'id', + type: 'int', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + pk2: { + name: 'pk2', + columns: { + id: { + name: 'id', + type: 'int', + primaryKey: true, + notNull: true, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + ref_table: { + name: 'ref_table', + columns: { + id: { + name: 'id', + type: 'int', + primaryKey: true, + notNull: true, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + }, + enums: {}, + }; + const json2: SQLiteSchemaSquashed = { + version: '6', + dialect: 'sqlite', + tables: { + create_reference: { + name: 'create_reference', + columns: { + id: { + name: 'id', + type: 'int', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: { + create_reference_id_ref_table_id_fk: + 'create_reference_id_ref_table_id_fk;create_reference;id;ref_table;id;no action;no action', + }, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + pk1: { + name: 'pk1', + columns: { + id: { + name: 'id', + type: 'int', + primaryKey: true, + notNull: true, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + pk2: { + name: 'pk2', + columns: { + id: { + name: 'id', + type: 'int', + primaryKey: false, + notNull: false, + 
autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + ref_table: { + name: 'ref_table', + columns: { + id: { + name: 'id', + type: 'int', + primaryKey: true, + notNull: true, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + }, + enums: {}, + }; + + const newJsonStatements = [ + { + type: 'recreate_table', + tableName: 'pk1', + columns: [ + { + name: 'id', + type: 'int', + primaryKey: true, + notNull: true, + autoincrement: false, + }, + ], + compositePKs: [], + referenceData: [], + uniqueConstraints: [], + }, + { + type: 'recreate_table', + tableName: 'pk2', + columns: [ + { + name: 'id', + type: 'int', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + ], + compositePKs: [], + referenceData: [], + uniqueConstraints: [], + }, + { + type: 'create_reference', + tableName: 'create_reference', + data: 'create_reference_id_ref_table_id_fk;create_reference;id;ref_table;id;no action;no action', + schema: '', + columnNotNull: false, + columnDefault: undefined, + columnType: 'int', + }, + ]; + expect(libSQLCombineStatements(statements, json2)).toStrictEqual( + newJsonStatements, + ); +}); + +/** + * ! before: + * + * fk1: { + * fk_id INT; + * fk_id1 INT; + * } + * + * fk2: { + * fk2_id INT; -> composite reference on ref_table id INT + * fk2_id1 INT; -> composite reference on ref_table id1 INT + * } + * + * ref_table: { + * id INT; + * id1 INT; + * } + * + * ! 
after: + * + * fk1: { + * fk_id INT; -> composite reference on ref_table id INT + * fk_id1 INT; -> composite reference on ref_table id1 INT + * } + * + * fk2: { + * fk2_id INT; + * fk2_id1 INT; + * } + * + * ref_table: { + * id INT; + * id1 INT; + * } + * + * set multi column reference for fk1 + * drop multi column reference for fk2 + * + * expect to: + * - "recreate_table" statement for fk1 + * - "recreate_table" statement for fk2 + */ +test(`set and drop multiple columns reference`, async (t) => { + const statements: JsonStatement[] = [ + { + type: 'delete_reference', + tableName: 'fk1', + data: 'fk1_fk_id_fk_id1_ref_table_id_id1_fk;fk1;fk_id,fk_id1;ref_table;id,id1;no action;no action', + schema: '', + isMulticolumn: true, + }, + { + type: 'create_reference', + tableName: 'fk2', + data: 'fk2_fk2_id_fk2_id1_ref_table_id_id1_fk;fk2;fk2_id,fk2_id1;ref_table;id,id1;no action;no action', + schema: '', + isMulticolumn: true, + }, + ]; + const json1: SQLiteSchemaSquashed = { + version: '6', + dialect: 'sqlite', + tables: { + fk1: { + name: 'fk1', + columns: { + fk_id: { + name: 'fk_id', + type: 'int', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + fk_id1: { + name: 'fk_id1', + type: 'int', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: { + fk1_fk_id_fk_id1_ref_table_id_id1_fk: + 'fk1_fk_id_fk_id1_ref_table_id_id1_fk;fk1;fk_id,fk_id1;ref_table;id,id1;no action;no action', + }, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + fk2: { + name: 'fk2', + columns: { + fk2_id: { + name: 'fk2_id', + type: 'int', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + fk2_id1: { + name: 'fk2_id1', + type: 'int', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + ref_table: { + name: 'ref_table', + columns: { + id: { + name: 'id', + type: 'int', + primaryKey: 
false, + notNull: false, + autoincrement: false, + }, + id1: { + name: 'id1', + type: 'int', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + }, + enums: {}, + }; + const json2: SQLiteSchemaSquashed = { + version: '6', + dialect: 'sqlite', + tables: { + fk1: { + name: 'fk1', + columns: { + fk_id: { + name: 'fk_id', + type: 'int', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + fk_id1: { + name: 'fk_id1', + type: 'int', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + fk2: { + name: 'fk2', + columns: { + fk2_id: { + name: 'fk2_id', + type: 'int', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + fk2_id1: { + name: 'fk2_id1', + type: 'int', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: { + fk2_fk2_id_fk2_id1_ref_table_id_id1_fk: + 'fk2_fk2_id_fk2_id1_ref_table_id_id1_fk;fk2;fk2_id,fk2_id1;ref_table;id,id1;no action;no action', + }, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + ref_table: { + name: 'ref_table', + columns: { + id: { + name: 'id', + type: 'int', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + id1: { + name: 'id1', + type: 'int', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + }, + enums: {}, + }; + + const newJsonStatements = [ + { + type: 'recreate_table', + tableName: 'fk1', + columns: [ + { + name: 'fk_id', + type: 'int', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + { + name: 'fk_id1', + type: 'int', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + ], + compositePKs: [], + referenceData: [], + uniqueConstraints: [], + }, + { 
+ type: 'recreate_table', + tableName: 'fk2', + columns: [ + { + name: 'fk2_id', + type: 'int', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + { + name: 'fk2_id1', + type: 'int', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + ], + compositePKs: [], + referenceData: [ + { + name: 'fk2_fk2_id_fk2_id1_ref_table_id_id1_fk', + tableFrom: 'fk2', + tableTo: 'ref_table', + columnsFrom: ['fk2_id', 'fk2_id1'], + columnsTo: ['id', 'id1'], + onDelete: 'no action', + onUpdate: 'no action', + }, + ], + uniqueConstraints: [], + }, + ]; + expect(libSQLCombineStatements(statements, json2)).toStrictEqual( + newJsonStatements, + ); +}); + +/** + * ! before: + * + * pk: { + * pk TEXT PRIMARY KEY; + * } + * + * simple: { + * simple TEXT; + * } + * + * unique: { + * unique INT UNIQUE; + * } + * + * ! after: + * + * pk: { + * pk INT PRIMARY KEY; + * } + * + * simple: { + * simple INT; + * } + * + * unique: { + * unique TEXT UNIQUE; + * } + * + * set new type for primary key column + * set new type for unique column + * set new type for column without pk or unique + * + * expect to: + * - "recreate_table" statement for pk + * - "recreate_table" statement for unique + * - "alter_table_alter_column_set_type" statement for simple + * - "create_index" statement for unique + */ +test(`set new type for primary key, unique and normal column`, async (t) => { + const statements: JsonStatement[] = [ + { + type: 'alter_table_alter_column_set_type', + tableName: 'pk', + columnName: 'pk', + newDataType: 'int', + oldDataType: 'text', + schema: '', + columnDefault: undefined, + columnOnUpdate: undefined, + columnNotNull: true, + columnAutoIncrement: false, + columnPk: true, + } as unknown as JsonStatement, + { + type: 'alter_table_alter_column_set_type', + tableName: 'simple', + columnName: 'simple', + newDataType: 'int', + oldDataType: 'text', + schema: '', + columnDefault: undefined, + columnOnUpdate: undefined, + columnNotNull: false, + 
columnAutoIncrement: false, + columnPk: false, + } as unknown as JsonStatement, + { + type: 'alter_table_alter_column_set_type', + tableName: 'unique', + columnName: 'unique', + newDataType: 'text', + oldDataType: 'int', + schema: '', + columnDefault: undefined, + columnOnUpdate: undefined, + columnNotNull: false, + columnAutoIncrement: false, + columnPk: false, + } as unknown as JsonStatement, + ]; + const json1: SQLiteSchemaSquashed = { + version: '6', + dialect: 'sqlite', + tables: { + pk: { + name: 'pk', + columns: { + pk: { + name: 'pk', + type: 'text', + primaryKey: true, + notNull: true, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + simple: { + name: 'simple', + columns: { + simple: { + name: 'simple', + type: 'text', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + unique: { + name: 'unique', + columns: { + unique: { + name: 'unique', + type: 'int', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: { + unique_unique_unique: 'unique_unique_unique;unique;true;', + }, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + }, + enums: {}, + }; + const json2: SQLiteSchemaSquashed = { + version: '6', + dialect: 'sqlite', + tables: { + pk: { + name: 'pk', + columns: { + pk: { + name: 'pk', + type: 'int', + primaryKey: true, + notNull: true, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + simple: { + name: 'simple', + columns: { + simple: { + name: 'simple', + type: 'int', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + unique: { + name: 'unique', + columns: { + unique: { + name: 'unique', + type: 'text', + 
primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: { + unique_unique_unique: 'unique_unique_unique;unique;true;', + }, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + }, + enums: {}, + }; + + const newJsonStatements = [ + { + type: 'recreate_table', + tableName: 'pk', + columns: [ + { + name: 'pk', + type: 'int', + primaryKey: true, + notNull: true, + autoincrement: false, + }, + ], + compositePKs: [], + referenceData: [], + uniqueConstraints: [], + }, + { + type: 'alter_table_alter_column_set_type', + tableName: 'simple', + columnName: 'simple', + newDataType: 'int', + oldDataType: 'text', + schema: '', + columnDefault: undefined, + columnOnUpdate: undefined, + columnNotNull: false, + columnAutoIncrement: false, + columnPk: false, + }, + { + type: 'alter_table_alter_column_set_type', + tableName: 'unique', + columnName: 'unique', + newDataType: 'text', + oldDataType: 'int', + schema: '', + columnDefault: undefined, + columnOnUpdate: undefined, + columnNotNull: false, + columnAutoIncrement: false, + columnPk: false, + }, + ]; + expect(libSQLCombineStatements(statements, json2)).toStrictEqual( + newJsonStatements, + ); +}); + +test(`add columns. 
set fk`, async (t) => { + const statements: JsonStatement[] = [ + { + type: 'sqlite_alter_table_add_column', + tableName: 'ref', + column: { + name: 'test', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + referenceData: undefined, + }, + { + type: 'sqlite_alter_table_add_column', + tableName: 'ref', + column: { + name: 'test1', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + referenceData: undefined, + }, + { + type: 'create_reference', + tableName: 'ref', + data: 'ref_new_age_user_new_age_fk;ref;new_age;user;new_age;no action;no action', + schema: '', + columnNotNull: false, + columnDefault: undefined, + columnType: 'integer', + }, + ]; + const json1: SQLiteSchemaSquashed = { + version: '6', + dialect: 'sqlite', + tables: { + ref: { + name: 'ref', + columns: { + id1: { + name: 'id1', + type: 'text', + primaryKey: false, + notNull: true, + autoincrement: false, + }, + new_age: { + name: 'new_age', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + user: { + name: 'user', + columns: { + id1: { + name: 'id1', + type: 'text', + primaryKey: false, + notNull: true, + autoincrement: false, + }, + new_age: { + name: 'new_age', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + }, + enums: {}, + }; + const json2: SQLiteSchemaSquashed = { + version: '6', + dialect: 'sqlite', + tables: { + ref: { + name: 'ref', + columns: { + id1: { + name: 'id1', + type: 'text', + primaryKey: false, + notNull: true, + autoincrement: false, + }, + new_age: { + name: 'new_age', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + test: { + name: 'test', + type: 'integer', + primaryKey: false, + notNull: false, + 
autoincrement: false, + }, + test1: { + name: 'test1', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: { + ref_new_age_user_new_age_fk: 'ref_new_age_user_new_age_fk;ref;new_age;user;new_age;no action;no action', + }, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + user: { + name: 'user', + columns: { + id1: { + name: 'id1', + type: 'text', + primaryKey: false, + notNull: true, + autoincrement: false, + }, + new_age: { + name: 'new_age', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + }, + enums: {}, + }; + + const newJsonStatements = [ + { + type: 'sqlite_alter_table_add_column', + tableName: 'ref', + column: { + name: 'test', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + referenceData: undefined, + }, + { + type: 'sqlite_alter_table_add_column', + tableName: 'ref', + column: { + name: 'test1', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + referenceData: undefined, + }, + { + type: 'create_reference', + tableName: 'ref', + data: 'ref_new_age_user_new_age_fk;ref;new_age;user;new_age;no action;no action', + schema: '', + columnNotNull: false, + columnDefault: undefined, + columnType: 'integer', + }, + ]; + expect(libSQLCombineStatements(statements, json2)).toStrictEqual( + newJsonStatements, + ); +}); + +test(`add column and fk`, async (t) => { + const statements: JsonStatement[] = [ + { + type: 'sqlite_alter_table_add_column', + tableName: 'ref', + column: { + name: 'test1', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + referenceData: 'ref_test1_user_new_age_fk;ref;test1;user;new_age;no action;no action', + }, + { + type: 'create_reference', + tableName: 'ref', + data: 
'ref_test1_user_new_age_fk;ref;test1;user;new_age;no action;no action', + schema: '', + columnNotNull: false, + columnDefault: undefined, + columnType: 'integer', + }, + ]; + const json1: SQLiteSchemaSquashed = { + version: '6', + dialect: 'sqlite', + tables: { + ref: { + name: 'ref', + columns: { + id1: { + name: 'id1', + type: 'text', + primaryKey: false, + notNull: true, + autoincrement: false, + }, + new_age: { + name: 'new_age', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + test1: { + name: 'test1', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: { + ref_test1_user_new_age_fk: 'ref_test1_user_new_age_fk;ref;test1;user;new_age;no action;no action', + }, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + user: { + name: 'user', + columns: { + id1: { + name: 'id1', + type: 'text', + primaryKey: false, + notNull: true, + autoincrement: false, + }, + new_age: { + name: 'new_age', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + }, + enums: {}, + }; + const json2: SQLiteSchemaSquashed = { + version: '6', + dialect: 'sqlite', + tables: { + ref: { + name: 'ref', + columns: { + id1: { + name: 'id1', + type: 'text', + primaryKey: false, + notNull: true, + autoincrement: false, + }, + new_age: { + name: 'new_age', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + test: { + name: 'test', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + test1: { + name: 'test1', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: { + ref_new_age_user_new_age_fk: 'ref_new_age_user_new_age_fk;ref;new_age;user;new_age;no action;no action', + }, + compositePrimaryKeys: {}, + 
uniqueConstraints: {}, + }, + user: { + name: 'user', + columns: { + id1: { + name: 'id1', + type: 'text', + primaryKey: false, + notNull: true, + autoincrement: false, + }, + new_age: { + name: 'new_age', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + }, + enums: {}, + }; + + const newJsonStatements = [ + { + type: 'sqlite_alter_table_add_column', + tableName: 'ref', + column: { + name: 'test1', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + referenceData: 'ref_test1_user_new_age_fk;ref;test1;user;new_age;no action;no action', + }, + ]; + expect(libSQLCombineStatements(statements, json2)).toStrictEqual( + newJsonStatements, + ); +}); + +test(`add column and fk`, async (t) => { + const statements: JsonStatement[] = [ + { + type: 'sqlite_alter_table_add_column', + tableName: 'ref', + column: { + name: 'test1', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + referenceData: 'ref_test1_user_new_age_fk;ref;test1;user;new_age;no action;no action', + }, + { + type: 'create_reference', + tableName: 'ref', + data: 'ref_test1_user_new_age_fk;ref;test1;user;new_age;no action;no action', + schema: '', + columnNotNull: false, + columnDefault: undefined, + columnType: 'integer', + }, + ]; + const json1: SQLiteSchemaSquashed = { + version: '6', + dialect: 'sqlite', + tables: { + ref: { + name: 'ref', + columns: { + id1: { + name: 'id1', + type: 'text', + primaryKey: false, + notNull: true, + autoincrement: false, + }, + new_age: { + name: 'new_age', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + test1: { + name: 'test1', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: { + ref_test1_user_new_age_fk: 
'ref_test1_user_new_age_fk;ref;test1;user;new_age;no action;no action', + }, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + user: { + name: 'user', + columns: { + id1: { + name: 'id1', + type: 'text', + primaryKey: false, + notNull: true, + autoincrement: false, + }, + new_age: { + name: 'new_age', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + }, + enums: {}, + }; + const json2: SQLiteSchemaSquashed = { + version: '6', + dialect: 'sqlite', + tables: { + ref: { + name: 'ref', + columns: { + id1: { + name: 'id1', + type: 'text', + primaryKey: false, + notNull: true, + autoincrement: false, + }, + new_age: { + name: 'new_age', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + test: { + name: 'test', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + test1: { + name: 'test1', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: { + ref_new_age_user_new_age_fk: 'ref_new_age_user_new_age_fk;ref;new_age;user;new_age;no action;no action', + }, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + user: { + name: 'user', + columns: { + id1: { + name: 'id1', + type: 'text', + primaryKey: false, + notNull: true, + autoincrement: false, + }, + new_age: { + name: 'new_age', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + }, + enums: {}, + }; + + const newJsonStatements = [ + { + type: 'sqlite_alter_table_add_column', + tableName: 'ref', + column: { + name: 'test1', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + referenceData: 'ref_test1_user_new_age_fk;ref;test1;user;new_age;no action;no action', + }, 
+ ]; + expect(libSQLCombineStatements(statements, json2)).toStrictEqual( + newJsonStatements, + ); +}); diff --git a/drizzle-kit/tests/statements-combiner/sqlite-statements-combiner.test.ts b/drizzle-kit/tests/statements-combiner/sqlite-statements-combiner.test.ts new file mode 100644 index 000000000..2fcaf6436 --- /dev/null +++ b/drizzle-kit/tests/statements-combiner/sqlite-statements-combiner.test.ts @@ -0,0 +1,1170 @@ +import { JsonStatement } from 'src/jsonStatements'; +import { SQLiteSchemaSquashed } from 'src/serializer/sqliteSchema'; +import { sqliteCombineStatements } from 'src/statementCombiner'; +import { expect, test } from 'vitest'; + +test(`renamed column and altered this column type`, async (t) => { + const statements: JsonStatement[] = [ + { + type: 'alter_table_rename_column', + tableName: 'user', + oldColumnName: 'lastName', + newColumnName: 'lastName123', + schema: '', + }, + { + type: 'alter_table_alter_column_set_type', + tableName: 'user', + columnName: 'lastName123', + newDataType: 'int', + oldDataType: 'text', + schema: '', + columnDefault: undefined, + columnOnUpdate: undefined, + columnNotNull: false, + columnAutoIncrement: false, + columnPk: false, + columnIsUnique: false, + } as unknown as JsonStatement, + ]; + const json1: SQLiteSchemaSquashed = { + version: '6', + dialect: 'sqlite', + tables: { + user: { + name: 'user', + columns: { + firstName: { + name: 'firstName', + type: 'int', + primaryKey: true, + notNull: true, + autoincrement: false, + }, + lastName: { + name: 'lastName', + type: 'text', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + test: { + name: 'test', + type: 'text', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + }, + enums: {}, + }; + const json2: SQLiteSchemaSquashed = { + version: '6', + dialect: 'sqlite', + tables: { + user: { + name: 'user', + columns: { + firstName: { + name: 
'firstName', + type: 'int', + primaryKey: true, + notNull: true, + autoincrement: false, + }, + lastName: { + name: 'lastName123', + type: 'int', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + test: { + name: 'test', + type: 'int', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + }, + enums: {}, + }; + + const newJsonStatements = [ + { + type: 'recreate_table', + tableName: 'user', + columns: [ + { + name: 'firstName', + type: 'int', + primaryKey: true, + notNull: true, + autoincrement: false, + }, + { + name: 'lastName123', + type: 'int', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + { + name: 'test', + type: 'int', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + ], + compositePKs: [], + referenceData: [], + uniqueConstraints: [], + }, + ]; + expect(sqliteCombineStatements(statements, json2)).toStrictEqual( + newJsonStatements, + ); +}); + +test(`renamed column and droped column "test"`, async (t) => { + const statements: JsonStatement[] = [ + { + type: 'alter_table_rename_column', + tableName: 'user', + oldColumnName: 'lastName', + newColumnName: 'lastName123', + schema: '', + }, + { + type: 'alter_table_drop_column', + tableName: 'user', + columnName: 'test', + schema: '', + }, + ]; + const json1: SQLiteSchemaSquashed = { + version: '6', + dialect: 'sqlite', + tables: { + user: { + name: 'user', + columns: { + firstName: { + name: 'firstName', + type: 'int', + primaryKey: true, + notNull: true, + autoincrement: false, + }, + lastName: { + name: 'lastName', + type: 'text', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + test: { + name: 'test', + type: 'text', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + }, + enums: {}, + }; + const json2: 
SQLiteSchemaSquashed = { + version: '6', + dialect: 'sqlite', + tables: { + user: { + name: 'user', + columns: { + firstName: { + name: 'firstName', + type: 'int', + primaryKey: true, + notNull: true, + autoincrement: false, + }, + lastName: { + name: 'lastName123', + type: 'int', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + test: { + name: 'test', + type: 'int', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + }, + enums: {}, + }; + + const newJsonStatements: JsonStatement[] = [ + { + type: 'alter_table_rename_column', + tableName: 'user', + oldColumnName: 'lastName', + newColumnName: 'lastName123', + schema: '', + }, + { + type: 'alter_table_drop_column', + tableName: 'user', + columnName: 'test', + schema: '', + }, + ]; + expect(sqliteCombineStatements(statements, json2)).toStrictEqual( + newJsonStatements, + ); +}); + +test(`droped column that is part of composite pk`, async (t) => { + const statements: JsonStatement[] = [ + { type: 'delete_composite_pk', tableName: 'user', data: 'id,iq' }, + { + type: 'alter_table_alter_column_set_pk', + tableName: 'user', + schema: '', + columnName: 'id', + }, + { + type: 'alter_table_drop_column', + tableName: 'user', + columnName: 'iq', + schema: '', + }, + ]; + const json1: SQLiteSchemaSquashed = { + version: '6', + dialect: 'sqlite', + tables: { + user: { + name: 'user', + columns: { + id: { + name: 'id', + type: 'int', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + first_nam: { + name: 'first_nam', + type: 'text', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + iq: { + name: 'iq', + type: 'int', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: { + user_id_iq_pk: 'id,iq', + }, + uniqueConstraints: {}, + }, + }, + enums: {}, + }; + const json2: 
SQLiteSchemaSquashed = { + version: '6', + dialect: 'sqlite', + tables: { + user: { + name: 'user', + columns: { + id: { + name: 'id', + type: 'int', + primaryKey: true, + notNull: false, + autoincrement: false, + }, + first_nam: { + name: 'first_nam', + type: 'text', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + }, + enums: {}, + }; + + const newJsonStatements: JsonStatement[] = [ + { + type: 'recreate_table', + tableName: 'user', + columns: [ + { + name: 'id', + type: 'int', + primaryKey: true, + notNull: false, + autoincrement: false, + }, + { + name: 'first_nam', + type: 'text', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + ], + compositePKs: [], + referenceData: [], + uniqueConstraints: [], + }, + ]; + expect(sqliteCombineStatements(statements, json2)).toStrictEqual( + newJsonStatements, + ); +}); + +test(`drop column "ref"."name", rename column "ref"."age". dropped primary key "user"."id". 
Set not null to "user"."iq"`, async (t) => { + const statements: JsonStatement[] = [ + { + type: 'alter_table_rename_column', + tableName: 'ref', + oldColumnName: 'age', + newColumnName: 'age1', + schema: '', + }, + { + type: 'alter_table_alter_column_drop_pk', + tableName: 'user', + columnName: 'id', + schema: '', + }, + { + type: 'alter_table_alter_column_drop_autoincrement', + tableName: 'user', + columnName: 'id', + schema: '', + newDataType: 'int', + columnDefault: undefined, + columnOnUpdate: undefined, + columnNotNull: false, + columnAutoIncrement: false, + columnPk: false, + } as unknown as JsonStatement, + { + type: 'alter_table_alter_column_drop_notnull', + tableName: 'user', + columnName: 'id', + schema: '', + newDataType: 'int', + columnDefault: undefined, + columnOnUpdate: undefined, + columnNotNull: false, + columnAutoIncrement: false, + columnPk: false, + } as unknown as JsonStatement, + { + type: 'alter_table_alter_column_set_notnull', + tableName: 'user', + columnName: 'iq', + schema: '', + newDataType: 'int', + columnDefault: undefined, + columnOnUpdate: undefined, + columnNotNull: true, + columnAutoIncrement: false, + columnPk: false, + } as unknown as JsonStatement, + { + type: 'alter_table_drop_column', + tableName: 'ref', + columnName: 'text', + schema: '', + }, + ]; + const json1: SQLiteSchemaSquashed = { + version: '6', + dialect: 'sqlite', + tables: { + ref: { + name: 'ref', + columns: { + id: { + name: 'id', + type: 'int', + primaryKey: true, + notNull: true, + autoincrement: true, + }, + user_iq: { + name: 'user_iq', + type: 'text', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + name: { + name: 'name', + type: 'text', + primaryKey: false, + notNull: true, + autoincrement: false, + }, + age: { + name: 'age', + type: 'int', + primaryKey: false, + notNull: true, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: { + ref_user_iq_user_iq_fk: 'ref_user_iq_user_iq_fk;ref;user_iq;user;iq;no action;no action', + 
}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + user: { + name: 'user', + columns: { + id: { + name: 'id', + type: 'int', + primaryKey: true, + notNull: true, + autoincrement: true, + }, + first_name: { + name: 'first_name', + type: 'text', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + iq: { + name: 'iq', + type: 'int', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + }, + enums: {}, + }; + const json2: SQLiteSchemaSquashed = { + version: '6', + dialect: 'sqlite', + tables: { + ref: { + name: 'ref', + columns: { + id: { + name: 'id', + type: 'int', + primaryKey: true, + notNull: true, + autoincrement: false, + }, + user_iq: { + name: 'user_iq', + type: 'text', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + age1: { + name: 'age1', + type: 'int', + primaryKey: false, + notNull: true, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: { + ref_user_iq_user_iq_fk: 'ref_user_iq_user_iq_fk;ref;user_iq;user;iq;no action;no action', + }, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + user: { + name: 'user', + columns: { + id: { + name: 'id', + type: 'int', + primaryKey: false, + notNull: true, + autoincrement: false, + }, + first_name: { + name: 'first_name', + type: 'text', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + iq: { + name: 'iq', + type: 'int', + primaryKey: false, + notNull: true, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + }, + enums: {}, + }; + + const newJsonStatements: JsonStatement[] = [ + { + type: 'alter_table_rename_column', + tableName: 'ref', + oldColumnName: 'age', + newColumnName: 'age1', + schema: '', + }, + { + type: 'alter_table_drop_column', + tableName: 'ref', + columnName: 'text', + schema: '', + }, + { + type: 'recreate_table', + 
tableName: 'user', + columns: [ + { + name: 'id', + type: 'int', + primaryKey: false, + notNull: true, + autoincrement: false, + }, + { + name: 'first_name', + type: 'text', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + { + name: 'iq', + type: 'int', + primaryKey: false, + notNull: true, + autoincrement: false, + }, + ], + compositePKs: [], + referenceData: [], + uniqueConstraints: [], + }, + ]; + + expect(sqliteCombineStatements(statements, json2)).toStrictEqual( + newJsonStatements, + ); +}); + +test(`create reference on existing column (table includes unique index). expect to recreate column and recreate index`, async (t) => { + const statements: JsonStatement[] = [ + { + type: 'create_reference', + tableName: 'unique', + data: 'unique_ref_pk_pk_pk_fk;unique;ref_pk;pk;pk;no action;no action', + schema: '', + }, + ]; + const json1: SQLiteSchemaSquashed = { + version: '6', + dialect: 'sqlite', + tables: { + pk: { + name: 'pk', + columns: { + pk: { + name: 'pk', + type: 'int', + primaryKey: true, + notNull: true, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + unique: { + name: 'unique', + columns: { + unique: { + name: 'unique', + type: 'text', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + ref_pk: { + name: 'ref_pk', + type: 'int', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: { + unique_unique_unique: 'unique_unique_unique;unique;true;', + }, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + }, + enums: {}, + }; + const json2: SQLiteSchemaSquashed = { + version: '6', + dialect: 'sqlite', + tables: { + pk: { + name: 'pk', + columns: { + pk: { + name: 'pk', + type: 'int', + primaryKey: true, + notNull: true, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + unique: { + name: 'unique', + columns: { 
+ unique: { + name: 'unique', + type: 'text', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + ref_pk: { + name: 'ref_pk', + type: 'int', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: { + unique_unique_unique: 'unique_unique_unique;unique;true;', + }, + foreignKeys: { + unique_ref_pk_pk_pk_fk: 'unique_ref_pk_pk_pk_fk;unique;ref_pk;pk;pk;no action;no action', + }, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + }, + enums: {}, + }; + + const newJsonStatements: JsonStatement[] = [ + { + type: 'recreate_table', + tableName: 'unique', + columns: [ + { + name: 'unique', + type: 'text', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + { + name: 'ref_pk', + type: 'int', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + ], + compositePKs: [], + referenceData: [ + { + name: 'unique_ref_pk_pk_pk_fk', + tableFrom: 'unique', + tableTo: 'pk', + columnsFrom: ['ref_pk'], + columnsTo: ['pk'], + onDelete: 'no action', + onUpdate: 'no action', + }, + ], + uniqueConstraints: [], + }, + { + data: 'unique_unique_unique;unique;true;', + internal: undefined, + schema: '', + tableName: 'unique', + type: 'create_index', + }, + ]; + + expect(sqliteCombineStatements(statements, json2)).toStrictEqual( + newJsonStatements, + ); +}); + +test(`add columns. 
set fk`, async (t) => { + const statements: JsonStatement[] = [ + { + type: 'sqlite_alter_table_add_column', + tableName: 'ref', + column: { + name: 'test', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + referenceData: undefined, + }, + { + type: 'sqlite_alter_table_add_column', + tableName: 'ref', + column: { + name: 'test1', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + referenceData: undefined, + }, + { + type: 'create_reference', + tableName: 'ref', + data: 'ref_new_age_user_new_age_fk;ref;new_age;user;new_age;no action;no action', + schema: '', + columnNotNull: false, + columnDefault: undefined, + columnType: 'integer', + }, + ]; + const json1: SQLiteSchemaSquashed = { + version: '6', + dialect: 'sqlite', + tables: { + ref: { + name: 'ref', + columns: { + id1: { + name: 'id1', + type: 'text', + primaryKey: false, + notNull: true, + autoincrement: false, + }, + new_age: { + name: 'new_age', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + user: { + name: 'user', + columns: { + id1: { + name: 'id1', + type: 'text', + primaryKey: false, + notNull: true, + autoincrement: false, + }, + new_age: { + name: 'new_age', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + }, + enums: {}, + }; + const json2: SQLiteSchemaSquashed = { + version: '6', + dialect: 'sqlite', + tables: { + ref: { + name: 'ref', + columns: { + id1: { + name: 'id1', + type: 'text', + primaryKey: false, + notNull: true, + autoincrement: false, + }, + new_age: { + name: 'new_age', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + test: { + name: 'test', + type: 'integer', + primaryKey: false, + notNull: false, + 
autoincrement: false, + }, + test1: { + name: 'test1', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: { + ref_new_age_user_new_age_fk: 'ref_new_age_user_new_age_fk;ref;new_age;user;new_age;no action;no action', + }, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + user: { + name: 'user', + columns: { + id1: { + name: 'id1', + type: 'text', + primaryKey: false, + notNull: true, + autoincrement: false, + }, + new_age: { + name: 'new_age', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + }, + enums: {}, + }; + + const newJsonStatements = [ + { + columns: [ + { + autoincrement: false, + name: 'id1', + notNull: true, + primaryKey: false, + type: 'text', + }, + { + autoincrement: false, + name: 'new_age', + notNull: false, + primaryKey: false, + type: 'integer', + }, + { + autoincrement: false, + name: 'test', + notNull: false, + primaryKey: false, + type: 'integer', + }, + { + autoincrement: false, + name: 'test1', + notNull: false, + primaryKey: false, + type: 'integer', + }, + ], + compositePKs: [], + referenceData: [ + { + columnsFrom: [ + 'new_age', + ], + columnsTo: [ + 'new_age', + ], + name: 'ref_new_age_user_new_age_fk', + onDelete: 'no action', + onUpdate: 'no action', + tableFrom: 'ref', + tableTo: 'user', + }, + ], + tableName: 'ref', + type: 'recreate_table', + uniqueConstraints: [], + }, + ]; + expect(sqliteCombineStatements(statements, json2)).toStrictEqual( + newJsonStatements, + ); +}); + +test(`add column and fk`, async (t) => { + const statements: JsonStatement[] = [ + { + type: 'sqlite_alter_table_add_column', + tableName: 'ref', + column: { + name: 'test1', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + referenceData: 'ref_test1_user_new_age_fk;ref;test1;user;new_age;no action;no action', + 
}, + { + type: 'create_reference', + tableName: 'ref', + data: 'ref_test1_user_new_age_fk;ref;test1;user;new_age;no action;no action', + schema: '', + columnNotNull: false, + columnDefault: undefined, + columnType: 'integer', + }, + ]; + const json1: SQLiteSchemaSquashed = { + version: '6', + dialect: 'sqlite', + tables: { + ref: { + name: 'ref', + columns: { + id1: { + name: 'id1', + type: 'text', + primaryKey: false, + notNull: true, + autoincrement: false, + }, + new_age: { + name: 'new_age', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + test1: { + name: 'test1', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: { + ref_test1_user_new_age_fk: 'ref_test1_user_new_age_fk;ref;test1;user;new_age;no action;no action', + }, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + user: { + name: 'user', + columns: { + id1: { + name: 'id1', + type: 'text', + primaryKey: false, + notNull: true, + autoincrement: false, + }, + new_age: { + name: 'new_age', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + }, + enums: {}, + }; + const json2: SQLiteSchemaSquashed = { + version: '6', + dialect: 'sqlite', + tables: { + ref: { + name: 'ref', + columns: { + id1: { + name: 'id1', + type: 'text', + primaryKey: false, + notNull: true, + autoincrement: false, + }, + new_age: { + name: 'new_age', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + test: { + name: 'test', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + test1: { + name: 'test1', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: { + ref_new_age_user_new_age_fk: 'ref_new_age_user_new_age_fk;ref;new_age;user;new_age;no 
action;no action', + }, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + user: { + name: 'user', + columns: { + id1: { + name: 'id1', + type: 'text', + primaryKey: false, + notNull: true, + autoincrement: false, + }, + new_age: { + name: 'new_age', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + }, + indexes: {}, + foreignKeys: {}, + compositePrimaryKeys: {}, + uniqueConstraints: {}, + }, + }, + enums: {}, + }; + + const newJsonStatements = [ + { + type: 'sqlite_alter_table_add_column', + tableName: 'ref', + column: { + name: 'test1', + type: 'integer', + primaryKey: false, + notNull: false, + autoincrement: false, + }, + referenceData: 'ref_test1_user_new_age_fk;ref;test1;user;new_age;no action;no action', + }, + ]; + expect(sqliteCombineStatements(statements, json2)).toStrictEqual( + newJsonStatements, + ); +}); diff --git a/drizzle-orm/package.json b/drizzle-orm/package.json index 888f7efcb..333521a48 100644 --- a/drizzle-orm/package.json +++ b/drizzle-orm/package.json @@ -1,6 +1,6 @@ { "name": "drizzle-orm", - "version": "0.33.0", + "version": "0.34.0", "description": "Drizzle ORM package for SQL databases", "type": "module", "scripts": { @@ -46,7 +46,7 @@ "@aws-sdk/client-rds-data": ">=3", "@cloudflare/workers-types": ">=3", "@electric-sql/pglite": ">=0.1.1", - "@libsql/client": "*", + "@libsql/client": ">=0.10.0", "@neondatabase/serverless": ">=0.1", "@op-engineering/op-sqlite": ">=2", "@opentelemetry/api": "^1.4.1", @@ -161,7 +161,7 @@ "@aws-sdk/client-rds-data": "^3.549.0", "@cloudflare/workers-types": "^4.20230904.0", "@electric-sql/pglite": "^0.1.1", - "@libsql/client": "^0.5.6", + "@libsql/client": "^0.10.0", "@neondatabase/serverless": "^0.9.0", "@op-engineering/op-sqlite": "^2.0.16", "@opentelemetry/api": "^1.4.1", diff --git a/drizzle-orm/src/better-sqlite3/driver.ts b/drizzle-orm/src/better-sqlite3/driver.ts index 728586e57..8fe7c00fb 100644 --- a/drizzle-orm/src/better-sqlite3/driver.ts +++ 
b/drizzle-orm/src/better-sqlite3/driver.ts @@ -1,4 +1,5 @@ import type { Database, RunResult } from 'better-sqlite3'; +import { entityKind } from '~/entity.ts'; import { DefaultLogger } from '~/logger.ts'; import { createTableRelationsHelpers, @@ -11,9 +12,11 @@ import { SQLiteSyncDialect } from '~/sqlite-core/dialect.ts'; import type { DrizzleConfig } from '~/utils.ts'; import { BetterSQLiteSession } from './session.ts'; -export type BetterSQLite3Database< - TSchema extends Record = Record, -> = BaseSQLiteDatabase<'sync', RunResult, TSchema>; +export class BetterSQLite3Database = Record> + extends BaseSQLiteDatabase<'sync', RunResult, TSchema> +{ + static readonly [entityKind]: string = 'BetterSQLite3Database'; +} export function drizzle = Record>( client: Database, @@ -41,5 +44,5 @@ export function drizzle = Record; + return new BetterSQLite3Database('sync', dialect, session, schema) as BetterSQLite3Database; } diff --git a/drizzle-orm/src/bun-sqlite/driver.ts b/drizzle-orm/src/bun-sqlite/driver.ts index 0d196ff03..5771bd371 100644 --- a/drizzle-orm/src/bun-sqlite/driver.ts +++ b/drizzle-orm/src/bun-sqlite/driver.ts @@ -1,6 +1,7 @@ /// import type { Database } from 'bun:sqlite'; +import { entityKind } from '~/entity.ts'; import { DefaultLogger } from '~/logger.ts'; import { createTableRelationsHelpers, @@ -13,9 +14,11 @@ import { SQLiteSyncDialect } from '~/sqlite-core/dialect.ts'; import type { DrizzleConfig } from '~/utils.ts'; import { SQLiteBunSession } from './session.ts'; -export type BunSQLiteDatabase< +export class BunSQLiteDatabase< TSchema extends Record = Record, -> = BaseSQLiteDatabase<'sync', void, TSchema>; +> extends BaseSQLiteDatabase<'sync', void, TSchema> { + static readonly [entityKind]: string = 'BunSQLiteDatabase'; +} export function drizzle = Record>( client: Database, @@ -43,5 +46,5 @@ export function drizzle = Record; + return new BunSQLiteDatabase('sync', dialect, session, schema) as BunSQLiteDatabase; } diff --git 
a/drizzle-orm/src/expo-sqlite/driver.ts b/drizzle-orm/src/expo-sqlite/driver.ts index ae8ce6577..fb858e482 100644 --- a/drizzle-orm/src/expo-sqlite/driver.ts +++ b/drizzle-orm/src/expo-sqlite/driver.ts @@ -1,4 +1,5 @@ import type { SQLiteDatabase, SQLiteRunResult } from 'expo-sqlite/next'; +import { entityKind } from '~/entity.ts'; import { DefaultLogger } from '~/logger.ts'; import { createTableRelationsHelpers, @@ -11,9 +12,11 @@ import { SQLiteSyncDialect } from '~/sqlite-core/dialect.ts'; import type { DrizzleConfig } from '~/utils.ts'; import { ExpoSQLiteSession } from './session.ts'; -export type ExpoSQLiteDatabase< - TSchema extends Record = Record, -> = BaseSQLiteDatabase<'sync', SQLiteRunResult, TSchema>; +export class ExpoSQLiteDatabase = Record> + extends BaseSQLiteDatabase<'sync', SQLiteRunResult, TSchema> +{ + static readonly [entityKind]: string = 'ExpoSQLiteDatabase'; +} export function drizzle = Record>( client: SQLiteDatabase, @@ -41,5 +44,5 @@ export function drizzle = Record; + return new ExpoSQLiteDatabase('sync', dialect, session, schema) as ExpoSQLiteDatabase; } diff --git a/drizzle-orm/src/index.ts b/drizzle-orm/src/index.ts index bc72260b9..469f5713e 100644 --- a/drizzle-orm/src/index.ts +++ b/drizzle-orm/src/index.ts @@ -5,6 +5,8 @@ export * from './entity.ts'; export * from './errors.ts'; export * from './expressions.ts'; export * from './logger.ts'; +export * from './monodriver.ts'; +export * from './monomigrator.ts'; export * from './operations.ts'; export * from './query-promise.ts'; export * from './relations.ts'; diff --git a/drizzle-orm/src/libsql/migrator.ts b/drizzle-orm/src/libsql/migrator.ts index 58bcc9e05..d362a2e4d 100644 --- a/drizzle-orm/src/libsql/migrator.ts +++ b/drizzle-orm/src/libsql/migrator.ts @@ -5,7 +5,7 @@ import type { LibSQLDatabase } from './driver.ts'; export async function migrate>( db: LibSQLDatabase, - config: MigrationConfig, + config: MigrationConfig | string, ) { const migrations = 
readMigrationFiles(config); const migrationsTable = config === undefined @@ -47,5 +47,5 @@ export async function migrate>( } } - await db.session.batch(statementToBatch); + await db.session.migrate(statementToBatch); } diff --git a/drizzle-orm/src/libsql/session.ts b/drizzle-orm/src/libsql/session.ts index 29e4e268f..640977734 100644 --- a/drizzle-orm/src/libsql/session.ts +++ b/drizzle-orm/src/libsql/session.ts @@ -76,6 +76,21 @@ export class LibSQLSession< return batchResults.map((result, i) => preparedQueries[i]!.mapResult(result, true)); } + async migrate[] | readonly BatchItem<'sqlite'>[]>(queries: T) { + const preparedQueries: PreparedQuery[] = []; + const builtQueries: InStatement[] = []; + + for (const query of queries) { + const preparedQuery = query._prepare(); + const builtQuery = preparedQuery.getQuery(); + preparedQueries.push(preparedQuery); + builtQueries.push({ sql: builtQuery.sql, args: builtQuery.params as InArgs }); + } + + const batchResults = await this.client.migrate(builtQueries); + return batchResults.map((result, i) => preparedQueries[i]!.mapResult(result, true)); + } + override async transaction( transaction: (db: LibSQLTransaction) => T | Promise, _config?: SQLiteTransactionConfig, diff --git a/drizzle-orm/src/monodriver.ts b/drizzle-orm/src/monodriver.ts new file mode 100644 index 000000000..612da1bdc --- /dev/null +++ b/drizzle-orm/src/monodriver.ts @@ -0,0 +1,406 @@ +/* eslint-disable import/extensions */ +import type { RDSDataClient, RDSDataClientConfig, RDSDataClientConfig as RDSConfig } from '@aws-sdk/client-rds-data'; +import type { Client as LibsqlClient, Config as LibsqlConfig } from '@libsql/client'; +import type { + HTTPTransactionOptions as NeonHttpConfig, + NeonQueryFunction, + Pool as NeonServerlessPool, + PoolConfig as NeonServerlessConfig, +} from '@neondatabase/serverless'; +import type { Client as PlanetscaleClient, Config as PlanetscaleConfig } from '@planetscale/database'; +import type { Config as TiDBServerlessConfig, 
Connection as TiDBConnection } from '@tidbcloud/serverless'; +import type { QueryResult, QueryResultRow, VercelPool } from '@vercel/postgres'; +import type { Database as BetterSQLite3Database, Options as BetterSQLite3Options } from 'better-sqlite3'; +import type { Database as BunDatabase } from 'bun:sqlite'; +import type { Pool as Mysql2Pool, PoolOptions as Mysql2Config } from 'mysql2'; +import type { Pool as NodePgPool, PoolConfig as NodePGPoolConfig } from 'pg'; +import type { + Options as PostgresJSOptions, + PostgresType as PostgresJSPostgresType, + Sql as PostgresJsClient, +} from 'postgres'; +import type { AwsDataApiPgDatabase, DrizzleAwsDataApiPgConfig } from './aws-data-api/pg/index.ts'; +import type { BetterSQLite3Database as DrizzleBetterSQLite3Database } from './better-sqlite3/index.ts'; +import type { BunSQLiteDatabase } from './bun-sqlite/index.ts'; +import type { DrizzleD1Database } from './d1/index.ts'; +import type { LibSQLDatabase } from './libsql/index.ts'; +import type { MySql2Database, MySql2DrizzleConfig } from './mysql2/index.ts'; +import type { NeonHttpDatabase } from './neon-http/index.ts'; +import type { NeonDatabase } from './neon-serverless/index.ts'; +import type { NodePgDatabase } from './node-postgres/index.ts'; +import type { PlanetScaleDatabase } from './planetscale-serverless/index.ts'; +import type { PostgresJsDatabase } from './postgres-js/index.ts'; +import type { SingleStoreDriverDatabase, SingleStoreDriverDrizzleConfig } from './singlestore/driver.ts'; +import type { TiDBServerlessDatabase } from './tidb-serverless/index.ts'; +import type { DrizzleConfig } from './utils.ts'; +import type { VercelPgDatabase } from './vercel-postgres/index.ts'; + +type BunSqliteDatabaseOptions = + | number + | { + /** + * Open the database as read-only (no write operations, no create). 
+ * + * Equivalent to {@link constants.SQLITE_OPEN_READONLY} + */ + readonly?: boolean; + /** + * Allow creating a new database + * + * Equivalent to {@link constants.SQLITE_OPEN_CREATE} + */ + create?: boolean; + /** + * Open the database as read-write + * + * Equivalent to {@link constants.SQLITE_OPEN_READWRITE} + */ + readwrite?: boolean; + }; + +type BunSqliteDatabaseConfig = + | { + filename?: ':memory:' | (string & {}); + options?: BunSqliteDatabaseOptions; + } + | ':memory:' + | (string & {}) + | undefined; + +type BetterSQLite3DatabaseConfig = + | { + filename?: + | ':memory:' + | (string & {}) + | Buffer; + options?: BetterSQLite3Options; + } + | ':memory:' + | (string & {}) + | undefined; + +type MonodriverNeonHttpConfig = { + connectionString: string; + options?: NeonHttpConfig; +}; + +type VercelPrimitive = string | number | boolean | undefined | null; + +type DatabaseClient = + | 'node-postgres' + | 'postgres-js' + | 'neon-serverless' + | 'neon-http' + | 'vercel-postgres' + | 'aws-data-api-pg' + | 'planetscale' + | 'mysql2' + | 'tidb-serverless' + | 'libsql' + | 'd1' + | 'bun:sqlite' + | 'better-sqlite3' + | 'singlestore'; + +type ClientDrizzleInstanceMap> = { + 'node-postgres': NodePgDatabase; + 'postgres-js': PostgresJsDatabase; + 'neon-serverless': NeonDatabase; + 'neon-http': NeonHttpDatabase; + 'vercel-postgres': VercelPgDatabase; + 'aws-data-api-pg': AwsDataApiPgDatabase; + planetscale: PlanetScaleDatabase; + mysql2: MySql2Database; + 'tidb-serverless': TiDBServerlessDatabase; + libsql: LibSQLDatabase; + d1: DrizzleD1Database; + 'bun:sqlite': BunSQLiteDatabase; + 'better-sqlite3': DrizzleBetterSQLite3Database; + singlestore: SingleStoreDriverDatabase; +}; + +type ClientInstanceMap = { + 'node-postgres': NodePgPool; + 'postgres-js': PostgresJsClient; + 'neon-serverless': NeonServerlessPool; + 'neon-http': NeonQueryFunction; + 'vercel-postgres': + & VercelPool + & (( + strings: TemplateStringsArray, + ...values: VercelPrimitive[] + ) => Promise>); 
+ 'aws-data-api-pg': RDSDataClient; + planetscale: PlanetscaleClient; + mysql2: Mysql2Pool; + 'tidb-serverless': TiDBConnection; + libsql: LibsqlClient; + d1: D1Database; + 'bun:sqlite': BunDatabase; + 'better-sqlite3': BetterSQLite3Database; + singlestore: SingleStoreDriverDatabase; +}; + +type InitializerParams = { + 'node-postgres': { + connection: NodePGPoolConfig; + }; + 'postgres-js': { + connection: string | PostgresJSOptions>; + }; + 'neon-serverless': { + connection: NeonServerlessConfig; + }; + 'neon-http': { + connection: MonodriverNeonHttpConfig; + }; + 'vercel-postgres': { + connection: VercelPool; + }; + 'aws-data-api-pg': { + connection?: RDSConfig; + }; + planetscale: { + connection: PlanetscaleConfig; + }; + mysql2: { + connection: Mysql2Config | string; + }; + 'tidb-serverless': { + connection: TiDBServerlessConfig; + }; + libsql: { + connection: LibsqlConfig; + }; + d1: { + connection: D1Database; + }; + 'bun:sqlite': { + connection?: BunSqliteDatabaseConfig; + }; + 'better-sqlite3': { + connection?: BetterSQLite3DatabaseConfig; + }; + singlestore: { + // This Mysql2Config is from the node package 'mysql2' and not the one from Drizzle + connection: Mysql2Config; + }; +}; + +type DetermineClient< + TClient extends DatabaseClient, + TSchema extends Record, +> = + & ClientDrizzleInstanceMap< + TSchema + >[TClient] + & { + $client: ClientInstanceMap[TClient]; + }; + +const importError = (libName: string) => { + throw new Error( + `Please install '${libName}' for Drizzle ORM to connect to database`, + ); +}; + +function assertUnreachable(_: never | undefined): never { + throw new Error("Didn't expect to get here"); +} + +export async function drizzle< + TClient extends DatabaseClient, + TSchema extends Record = Record, +>( + client: TClient, + params: + & InitializerParams[TClient] + & (TClient extends 'mysql2' ? MySql2DrizzleConfig + : TClient extends 'aws-data-api-pg' ? DrizzleAwsDataApiPgConfig + : TClient extends 'neon-serverless' ? 
DrizzleConfig & { + ws?: any; + } + : TClient extends 'singlestore' ? SingleStoreDriverDrizzleConfig + : DrizzleConfig), +): Promise> { + const { connection, ws, ...drizzleConfig } = params as typeof params & { + ws?: any; + }; + + switch (client) { + case 'node-postgres': { + const { Pool } = await import('pg').catch(() => importError('pg')); + const { drizzle } = await import('./node-postgres'); + const instance = new Pool(connection as NodePGPoolConfig); + + const db = drizzle(instance, drizzleConfig) as any; + db.$client = instance; + + return db; + } + case 'aws-data-api-pg': { + const { RDSDataClient } = await import('@aws-sdk/client-rds-data').catch(() => + importError('@aws-sdk/client-rds-data') + ); + const { drizzle } = await import('./aws-data-api/pg'); + const instance = new RDSDataClient(connection as RDSDataClientConfig); + + const db = drizzle(instance, drizzleConfig as any as DrizzleAwsDataApiPgConfig) as any; + db.$client = instance; + + return db; + } + case 'better-sqlite3': { + const { default: Client } = await import('better-sqlite3').catch(() => importError('better-sqlite3')); + const { drizzle } = await import('./better-sqlite3'); + + if (typeof connection === 'object') { + const { filename, options } = connection as Exclude; + + const instance = new Client(filename, options); + + const db = drizzle(instance, drizzleConfig) as any; + db.$client = instance; + + return db; + } + + const instance = new Client(connection); + + const db = drizzle(instance, drizzleConfig) as any; + db.$client = instance; + + return db; + } + case 'bun:sqlite': { + const { Database: Client } = await import('bun:sqlite').catch(() => importError('bun:sqlite')); + const { drizzle } = await import('./bun-sqlite'); + + if (typeof connection === 'object') { + const { filename, options } = connection as Exclude; + + const instance = new Client(filename, options); + + const db = drizzle(instance, drizzleConfig) as any; + db.$client = instance; + + return db; + } + + const 
instance = new Client(connection); + + const db = drizzle(instance, drizzleConfig) as any; + db.$client = instance; + + return db; + } + case 'd1': { + const { drizzle } = await import('./d1'); + + const db = drizzle(connection as D1Database, drizzleConfig) as any; + db.$client = connection; + + return db; + } + case 'libsql': { + const { createClient } = await import('@libsql/client').catch(() => importError('@libsql/client')); + const { drizzle } = await import('./libsql'); + const instance = createClient(connection as LibsqlConfig); + + const db = drizzle(instance, drizzleConfig) as any; + db.$client = instance; + + return db; + } + case 'mysql2': { + const { createPool } = await import('mysql2/promise').catch(() => importError('mysql2/promise')); + const instance = createPool(connection as Mysql2Config); + const { drizzle } = await import('./mysql2'); + + const db = drizzle(instance, drizzleConfig as MySql2DrizzleConfig) as any; + db.$client = instance; + + return db; + } + case 'neon-http': { + const { neon } = await import('@neondatabase/serverless').catch(() => importError('@neondatabase/serverless')); + const { connectionString, options } = connection as MonodriverNeonHttpConfig; + const { drizzle } = await import('./neon-http'); + const instance = neon(connectionString, options); + + const db = drizzle(instance, drizzleConfig) as any; + db.$client = instance; + + return db; + } + case 'neon-serverless': { + const { Pool, neonConfig } = await import('@neondatabase/serverless').catch(() => + importError('@neondatabase/serverless') + ); + const { drizzle } = await import('./neon-serverless'); + const instance = new Pool(connection as NeonServerlessConfig); + + if (ws) { + neonConfig.webSocketConstructor = ws; + } + + const db = drizzle(instance, drizzleConfig) as any; + db.$client = instance; + + return db; + } + case 'planetscale': { + const { Client } = await import('@planetscale/database').catch(() => importError('@planetscale/database')); + const { 
drizzle } = await import('./planetscale-serverless');
+      const instance = new Client(
+        connection as PlanetscaleConfig,
+      );
+
+      const db = drizzle(instance, drizzleConfig) as any;
+      db.$client = instance;
+
+      return db;
+    }
+    case 'postgres-js': {
+      const { default: client } = await import('postgres').catch(() => importError('postgres'));
+      const { drizzle } = await import('./postgres-js');
+      const instance = client(connection as PostgresJSOptions>);
+
+      const db = drizzle(instance, drizzleConfig) as any;
+      db.$client = instance;
+
+      return db;
+    }
+    case 'tidb-serverless': {
+      const { connect } = await import('@tidbcloud/serverless').catch(() => importError('@tidbcloud/serverless'));
+      const { drizzle } = await import('./tidb-serverless');
+      const instance = connect(connection as TiDBServerlessConfig);
+
+      const db = drizzle(instance, drizzleConfig) as any;
+      db.$client = instance;
+
+      return db;
+    }
+    case 'vercel-postgres': {
+      const { sql } = await import('@vercel/postgres').catch(() => importError('@vercel/postgres'));
+      const { drizzle } = await import('./vercel-postgres');
+
+      const db = drizzle(sql, drizzleConfig) as any;
+      db.$client = sql;
+
+      return db;
+    }
+    case 'singlestore': {
+      const { createPool } = await import('mysql2/promise').catch(() => importError('mysql2/promise'));
+      const instance = createPool(connection as Mysql2Config);
+      const { drizzle } = await import('./mysql2');
+
+      const db = drizzle(instance, drizzleConfig as SingleStoreDriverDrizzleConfig) as any;
+      db.$client = instance;
+
+      return db;
+    }
+  }
+
+  assertUnreachable(client);
+}
diff --git a/drizzle-orm/src/monomigrator.ts b/drizzle-orm/src/monomigrator.ts
new file mode 100644
index 000000000..d113b223b
--- /dev/null
+++ b/drizzle-orm/src/monomigrator.ts
@@ -0,0 +1,111 @@
+/* eslint-disable import/extensions */
+import type { AwsDataApiPgDatabase } from './aws-data-api/pg/index.ts';
+import type { BetterSQLite3Database } from './better-sqlite3/index.ts';
+import type { BunSQLiteDatabase } from './bun-sqlite/index.ts';
+import type { DrizzleD1Database } from './d1/index.ts';
+import { entityKind } from './entity.ts';
+import type { LibSQLDatabase } from './libsql/index.ts';
+import type { MigrationConfig } from './migrator.ts';
+import type { MySql2Database } from './mysql2/index.ts';
+import type { NeonHttpDatabase } from './neon-http/index.ts';
+import type { NeonDatabase } from './neon-serverless/index.ts';
+import type { NodePgDatabase } from './node-postgres/index.ts';
+import type { PlanetScaleDatabase } from './planetscale-serverless/index.ts';
+import type { PostgresJsDatabase } from './postgres-js/index.ts';
+import type { SingleStoreDriverDatabase } from './singlestore/driver.ts';
+import type { TiDBServerlessDatabase } from './tidb-serverless/index.ts';
+import type { VercelPgDatabase } from './vercel-postgres/index.ts';
+
+export async function migrate(
+  db:
+    | AwsDataApiPgDatabase
+    | BetterSQLite3Database
+    | BunSQLiteDatabase
+    | DrizzleD1Database
+    | LibSQLDatabase
+    | MySql2Database
+    | NeonHttpDatabase
+    | NeonDatabase
+    | NodePgDatabase
+    | PlanetScaleDatabase
+    | PostgresJsDatabase
+    | VercelPgDatabase
+    | TiDBServerlessDatabase
+    | SingleStoreDriverDatabase,
+  config:
+    | string
+    | MigrationConfig,
+) {
+  switch (( db).constructor[entityKind]) {
+    case 'AwsDataApiPgDatabase': {
+      const { migrate } = await import('./aws-data-api/pg/migrator');
+
+      return migrate(db as AwsDataApiPgDatabase, config as string | MigrationConfig);
+    }
+    case 'BetterSQLite3Database': {
+      const { migrate } = await import('./better-sqlite3/migrator');
+
+      return migrate(db as BetterSQLite3Database, config as string | MigrationConfig);
+    }
+    case 'BunSQLiteDatabase': {
+      const { migrate } = await import('./bun-sqlite/migrator');
+
+      return migrate(db as BunSQLiteDatabase, config as string | MigrationConfig);
+    }
+    case 'D1Database': {
+      const { migrate } = await import('./d1/migrator');
+
+      return migrate(db as DrizzleD1Database, config as string | MigrationConfig);
+    }
+    case 'LibSQLDatabase': {
+      const { migrate } = await import('./libsql/migrator');
+
+      return migrate(db as LibSQLDatabase, config as string | MigrationConfig);
+    }
+    case 'MySql2Database': {
+      const { migrate } = await import('./mysql2/migrator');
+
+      return migrate(db as MySql2Database, config as string | MigrationConfig);
+    }
+    case 'NeonHttpDatabase': {
+      const { migrate } = await import('./neon-http/migrator');
+
+      return migrate(db as NeonHttpDatabase, config as string | MigrationConfig);
+    }
+    case 'NeonServerlessDatabase': {
+      const { migrate } = await import('./neon-serverless/migrator');
+
+      return migrate(db as NeonDatabase, config as string | MigrationConfig);
+    }
+    case 'NodePgDatabase': {
+      const { migrate } = await import('./node-postgres/migrator');
+
+      return migrate(db as NodePgDatabase, config as string | MigrationConfig);
+    }
+    case 'PlanetScaleDatabase': {
+      const { migrate } = await import('./planetscale-serverless/migrator');
+
+      return migrate(db as PlanetScaleDatabase, config as string | MigrationConfig);
+    }
+    case 'PostgresJsDatabase': {
+      const { migrate } = await import('./postgres-js/migrator');
+
+      return migrate(db as PostgresJsDatabase, config as string | MigrationConfig);
+    }
+    case 'TiDBServerlessDatabase': {
+      const { migrate } = await import('./tidb-serverless/migrator');
+
+      return migrate(db as TiDBServerlessDatabase, config as MigrationConfig);
+    }
+    case 'VercelPgDatabase': {
+      const { migrate } = await import('./vercel-postgres/migrator');
+
+      return migrate(db as VercelPgDatabase, config as string | MigrationConfig);
+    }
+    case 'SingleStoreDriverDatabase': {
+      const { migrate } = await import('./singlestore/migrator');
+
+      return migrate(db as SingleStoreDriverDatabase, config as MigrationConfig);
+    }
+  }
+}
diff --git a/drizzle-orm/src/mysql-core/db.ts b/drizzle-orm/src/mysql-core/db.ts
index 8df6ff343..8934c0edf 100644
--- a/drizzle-orm/src/mysql-core/db.ts
+++ b/drizzle-orm/src/mysql-core/db.ts
@@ -3,10 +3,11 @@ import { entityKind } from '~/entity.ts';
 import type { TypedQueryBuilder } from '~/query-builders/query-builder.ts';
 import type { ExtractTablesWithRelations, RelationalSchemaConfig, TablesRelationalConfig } from '~/relations.ts';
 import { SelectionProxyHandler } from '~/selection-proxy.ts';
-import type { ColumnsSelection, SQLWrapper } from '~/sql/sql.ts';
+import type { ColumnsSelection, SQL, SQLWrapper } from '~/sql/sql.ts';
 import { WithSubquery } from '~/subquery.ts';
 import type { DrizzleTypeError } from '~/utils.ts';
 import type { MySqlDialect } from './dialect.ts';
+import { MySqlCountBuilder } from './query-builders/count.ts';
 import {
   MySqlDeleteBase,
   MySqlInsertBuilder,
@@ -27,6 +28,7 @@
 } from './session.ts';
 import type { WithSubqueryWithSelection } from './subquery.ts';
 import type { MySqlTable } from './table.ts';
+import type { MySqlViewBase } from './view-base.ts';
 
 export class MySqlDatabase<
@@ -134,6 +136,13 @@ export class MySqlDatabase<
     };
   }
 
+  $count(
+    source: MySqlTable | MySqlViewBase | SQL | SQLWrapper,
+    filters?: SQL,
+  ) {
+    return new MySqlCountBuilder({ source, filters, session: this.session });
+  }
+
   /**
    * Incorporates a previously defined CTE (using `$with`) into the main query.
    *
diff --git a/drizzle-orm/src/mysql-core/query-builders/count.ts b/drizzle-orm/src/mysql-core/query-builders/count.ts
new file mode 100644
index 000000000..645bb4753
--- /dev/null
+++ b/drizzle-orm/src/mysql-core/query-builders/count.ts
@@ -0,0 +1,80 @@
+import { entityKind, sql } from '~/index.ts';
+import type { SQLWrapper } from '~/sql/sql.ts';
+import { SQL } from '~/sql/sql.ts';
+import type { MySqlSession } from '../session.ts';
+import type { MySqlTable } from '../table.ts';
+import type { MySqlViewBase } from '../view-base.ts';
+
+export class MySqlCountBuilder<
+  TSession extends MySqlSession,
+> extends SQL implements Promise, SQLWrapper {
+  private sql: SQL;
+
+  static readonly [entityKind] = 'MySqlCountBuilder';
+  [Symbol.toStringTag] = 'MySqlCountBuilder';
+
+  private session: TSession;
+
+  private static buildEmbeddedCount(
+    source: MySqlTable | MySqlViewBase | SQL | SQLWrapper,
+    filters?: SQL,
+  ): SQL {
+    return sql`(select count(*) from ${source}${sql.raw(' where ').if(filters)}${filters})`;
+  }
+
+  private static buildCount(
+    source: MySqlTable | MySqlViewBase | SQL | SQLWrapper,
+    filters?: SQL,
+  ): SQL {
+    return sql`select count(*) as count from ${source}${sql.raw(' where ').if(filters)}${filters}`;
+  }
+
+  constructor(
+    readonly params: {
+      source: MySqlTable | MySqlViewBase | SQL | SQLWrapper;
+      filters?: SQL;
+      session: TSession;
+    },
+  ) {
+    super(MySqlCountBuilder.buildEmbeddedCount(params.source, params.filters).queryChunks);
+
+    this.mapWith(Number);
+
+    this.session = params.session;
+
+    this.sql = MySqlCountBuilder.buildCount(
+      params.source,
+      params.filters,
+    );
+  }
+
+  then(
+    onfulfilled?: ((value: number) => TResult1 | PromiseLike) | null | undefined,
+    onrejected?: ((reason: any) => TResult2 | PromiseLike) | null | undefined,
+  ): Promise {
+    return Promise.resolve(this.session.count(this.sql))
+      .then(
+        onfulfilled,
+        onrejected,
+      );
+  }
+
+  catch(
+    onRejected?: ((reason: any) => never | PromiseLike) | null | undefined,
+  ): Promise {
+    return this.then(undefined, onRejected);
+  }
+
+  finally(onFinally?: (() => void) | null | undefined): Promise {
+    return this.then(
+      (value) => {
+        onFinally?.();
+        return value;
+      },
+      (reason) => {
+        onFinally?.();
+        throw reason;
+      },
+    );
+  }
+}
diff --git a/drizzle-orm/src/mysql-core/session.ts b/drizzle-orm/src/mysql-core/session.ts
index 6b6269639..021d4276d 100644
--- a/drizzle-orm/src/mysql-core/session.ts
+++ b/drizzle-orm/src/mysql-core/session.ts
@@ -86,6 +86,14 @@
   abstract all(query: SQL): Promise;
 
+  async count(sql: SQL): Promise {
+    const res = await this.execute<[[{ count: string }]]>(sql);
+
+    return Number(
+      res[0][0]['count'],
+    );
+  }
+
   abstract transaction(
     transaction: (tx: MySqlTransaction) => Promise,
     config?: MySqlTransactionConfig,
diff --git a/drizzle-orm/src/mysql-proxy/driver.ts b/drizzle-orm/src/mysql-proxy/driver.ts
index 574db42c1..dfbf69cc9 100644
--- a/drizzle-orm/src/mysql-proxy/driver.ts
+++ b/drizzle-orm/src/mysql-proxy/driver.ts
@@ -1,3 +1,4 @@
+import { entityKind } from '~/entity.ts';
 import { DefaultLogger } from '~/logger.ts';
 import { MySqlDatabase } from '~/mysql-core/db.ts';
 import { MySqlDialect } from '~/mysql-core/dialect.ts';
@@ -10,9 +11,11 @@ import {
 import type { DrizzleConfig } from '~/utils.ts';
 import { type MySqlRemotePreparedQueryHKT, type MySqlRemoteQueryResultHKT, MySqlRemoteSession } from './session.ts';
 
-export type MySqlRemoteDatabase<
+export class MySqlRemoteDatabase<
   TSchema extends Record = Record,
-> = MySqlDatabase;
+> extends MySqlDatabase {
+  static readonly [entityKind]: string = 'MySqlRemoteDatabase';
+}
 
 export type RemoteCallback = (
   sql: string,
@@ -46,5 +49,5 @@ export function drizzle = Record;
+  return new MySqlRemoteDatabase(dialect, session, schema as any, 'default') as MySqlRemoteDatabase;
 }
diff --git a/drizzle-orm/src/mysql2/driver.ts b/drizzle-orm/src/mysql2/driver.ts
index 3b21bf11d..a8fb65c3b 100644
--- a/drizzle-orm/src/mysql2/driver.ts
+++ b/drizzle-orm/src/mysql2/driver.ts
@@ -40,9 +40,11 @@ export class MySql2Driver {
 }
 
 export { MySqlDatabase } from '~/mysql-core/db.ts';
 
-export type MySql2Database<
+export class MySql2Database<
   TSchema extends Record = Record,
-> = MySqlDatabase;
+> extends MySqlDatabase {
+  static readonly [entityKind]: string = 'MySql2Database';
+}
 
 export type MySql2DrizzleConfig = Record> =
   & Omit, 'schema'>
@@ -87,7 +89,7 @@ export function drizzle = Record;
+  return new MySql2Database(dialect, session, schema as any, mode) as MySql2Database;
 }
 
 interface CallbackClient {
diff --git a/drizzle-orm/src/mysql2/migrator.ts b/drizzle-orm/src/mysql2/migrator.ts
index 2f3c9c3dc..ae56e01f1 100644
--- a/drizzle-orm/src/mysql2/migrator.ts
+++ b/drizzle-orm/src/mysql2/migrator.ts
@@ -4,8 +4,15 @@ import type { MySql2Database } from './driver.ts';
 
 export async function migrate>(
   db: MySql2Database,
-  config: MigrationConfig,
+  config: MigrationConfig | string,
 ) {
   const migrations = readMigrationFiles(config);
-  await db.dialect.migrate(migrations, db.session, config);
+
+  const preparedConfig = typeof config === 'string'
+    ? {
+      migrationsFolder: config,
+    }
+    : config;
+
+  await db.dialect.migrate(migrations, db.session, preparedConfig);
 }
diff --git a/drizzle-orm/src/neon-http/session.ts b/drizzle-orm/src/neon-http/session.ts
index 6d7685116..4dd768d3e 100644
--- a/drizzle-orm/src/neon-http/session.ts
+++ b/drizzle-orm/src/neon-http/session.ts
@@ -10,7 +10,7 @@ import type { PgQueryResultHKT, PgTransactionConfig, PreparedQueryConfig } from
 import { PgPreparedQuery as PgPreparedQuery, PgSession } from '~/pg-core/session.ts';
 import type { RelationalSchemaConfig, TablesRelationalConfig } from '~/relations.ts';
 import type { PreparedQuery } from '~/session.ts';
-import { fillPlaceholders, type Query } from '~/sql/sql.ts';
+import { fillPlaceholders, type Query, type SQL } from '~/sql/sql.ts';
 import { mapResultRow } from '~/utils.ts';
 
 export type NeonHttpClient = NeonQueryFunction;
@@ -161,6 +161,14 @@ export class NeonHttpSession<
     return this.client(query, params, { arrayMode: false, fullResults: true });
   }
 
+  override async count(sql: SQL): Promise {
+    const res = await this.execute<{ rows: [{ count: string }] }>(sql);
+
+    return Number(
+      res['rows'][0]['count'],
+    );
+  }
+
   override async transaction(
     _transaction: (tx: NeonTransaction) => Promise,
     // eslint-disable-next-line @typescript-eslint/no-unused-vars
diff --git a/drizzle-orm/src/neon-serverless/driver.ts b/drizzle-orm/src/neon-serverless/driver.ts
index 8a15dd678..7f42cfeb3 100644
--- a/drizzle-orm/src/neon-serverless/driver.ts
+++ b/drizzle-orm/src/neon-serverless/driver.ts
@@ -43,9 +43,11 @@ export class NeonDriver {
   }
 }
 
-export type NeonDatabase<
+export class NeonDatabase<
   TSchema extends Record = Record,
-> = PgDatabase;
+> extends PgDatabase {
+  static readonly [entityKind]: string = 'NeonServerlessDatabase';
+}
 
 export function drizzle = Record>(
   client: NeonClient,
@@ -74,5 +76,5 @@ export function drizzle = Record;
+  return new NeonDatabase(dialect, session, schema as any) as NeonDatabase;
 }
diff --git a/drizzle-orm/src/node-postgres/driver.ts b/drizzle-orm/src/node-postgres/driver.ts
index 4c233f891..15ac8fc06 100644
--- a/drizzle-orm/src/node-postgres/driver.ts
+++ b/drizzle-orm/src/node-postgres/driver.ts
@@ -45,9 +45,11 @@ export class NodePgDriver {
   }
 }
 
-export type NodePgDatabase<
+export class NodePgDatabase<
   TSchema extends Record = Record,
-> = PgDatabase;
+> extends PgDatabase {
+  static readonly [entityKind]: string = 'NodePgDatabase';
+}
 
 export function drizzle = Record>(
   client: NodePgClient,
@@ -76,5 +78,5 @@ export function drizzle = Record;
+  return new NodePgDatabase(dialect, session, schema as any) as NodePgDatabase;
 }
diff --git a/drizzle-orm/src/node-postgres/session.ts b/drizzle-orm/src/node-postgres/session.ts
index 91a21312a..ef6779354 100644
--- a/drizzle-orm/src/node-postgres/session.ts
+++ b/drizzle-orm/src/node-postgres/session.ts
@@ -8,7 +8,7 @@ import type { SelectedFieldsOrdered } from '~/pg-core/query-builders/select.types.ts';
 import type { PgQueryResultHKT, PgTransactionConfig, PreparedQueryConfig } from '~/pg-core/session.ts';
 import { PgPreparedQuery, PgSession } from '~/pg-core/session.ts';
 import type { RelationalSchemaConfig, TablesRelationalConfig } from '~/relations.ts';
-import { fillPlaceholders, type Query, sql } from '~/sql/sql.ts';
+import { fillPlaceholders, type Query, type SQL, sql } from '~/sql/sql.ts';
 import { tracer } from '~/tracing.ts';
 import { type Assume, mapResultRow } from '~/utils.ts';
@@ -164,6 +164,13 @@ export class NodePgSession<
       }
     }
   }
+
+  override async count(sql: SQL): Promise {
+    const res = await this.execute<{ rows: [{ count: string }] }>(sql);
+    return Number(
+      res['rows'][0]['count'],
+    );
+  }
 }
 
 export class NodePgTransaction<
diff --git a/drizzle-orm/src/op-sqlite/driver.ts b/drizzle-orm/src/op-sqlite/driver.ts
index 24c663abf..94ee6e866 100644
--- a/drizzle-orm/src/op-sqlite/driver.ts
+++ b/drizzle-orm/src/op-sqlite/driver.ts
@@ -1,4 +1,5 @@
 import type { OPSQLiteConnection, QueryResult } from '@op-engineering/op-sqlite';
+import { entityKind } from '~/entity.ts';
 import { DefaultLogger } from '~/logger.ts';
 import {
   createTableRelationsHelpers,
@@ -11,9 +12,11 @@ import { SQLiteAsyncDialect } from '~/sqlite-core/dialect.ts';
 import type { DrizzleConfig } from '~/utils.ts';
 import { OPSQLiteSession } from './session.ts';
 
-export type OPSQLiteDatabase<
+export class OPSQLiteDatabase<
   TSchema extends Record = Record,
-> = BaseSQLiteDatabase<'async', QueryResult, TSchema>;
+> extends BaseSQLiteDatabase<'async', QueryResult, TSchema> {
+  static readonly [entityKind]: string = 'OPSQLiteDatabase';
+}
 
 export function drizzle = Record>(
   client: OPSQLiteConnection,
@@ -41,5 +44,5 @@ export function drizzle = Record;
+  return new OPSQLiteDatabase('async', dialect, session, schema) as OPSQLiteDatabase;
 }
diff --git a/drizzle-orm/src/operations.ts b/drizzle-orm/src/operations.ts
index 492bb3f2a..6fb5cbd2e 100644
--- a/drizzle-orm/src/operations.ts
+++ b/drizzle-orm/src/operations.ts
@@ -21,6 +21,7 @@ export type OptionalKeyOnly<
   : T['_']['generated'] extends object ? T['_']['generated']['type'] extends 'byDefault' ? TKey
     : never
   : never;
 
+// TODO: SQL -> SQLWrapper
 export type SelectedFieldsFlat = Record<
   string,
   TColumn | SQL | SQL.Aliased
diff --git a/drizzle-orm/src/pg-core/db.ts b/drizzle-orm/src/pg-core/db.ts
index 4e8d2f354..62b64fb8f 100644
--- a/drizzle-orm/src/pg-core/db.ts
+++ b/drizzle-orm/src/pg-core/db.ts
@@ -19,15 +19,17 @@ import type { PgTable } from '~/pg-core/table.ts';
 import type { TypedQueryBuilder } from '~/query-builders/query-builder.ts';
 import type { ExtractTablesWithRelations, RelationalSchemaConfig, TablesRelationalConfig } from '~/relations.ts';
 import { SelectionProxyHandler } from '~/selection-proxy.ts';
-import type { ColumnsSelection, SQLWrapper } from '~/sql/sql.ts';
+import type { ColumnsSelection, SQL, SQLWrapper } from '~/sql/sql.ts';
 import { WithSubquery } from '~/subquery.ts';
 import type { DrizzleTypeError } from '~/utils.ts';
 import type { PgColumn } from './columns/index.ts';
+import { PgCountBuilder } from './query-builders/count.ts';
 import { RelationalQueryBuilder } from './query-builders/query.ts';
 import { PgRaw } from './query-builders/raw.ts';
 import { PgRefreshMaterializedView } from './query-builders/refresh-materialized-view.ts';
 import type { SelectedFields } from './query-builders/select.types.ts';
 import type { WithSubqueryWithSelection } from './subquery.ts';
+import type { PgViewBase } from './view-base.ts';
 import type { PgMaterializedView } from './view.ts';
 
 export class PgDatabase<
@@ -135,6 +137,13 @@ export class PgDatabase<
     };
   }
 
+  $count(
+    source: PgTable | PgViewBase | SQL | SQLWrapper,
+    filters?: SQL,
+  ) {
+    return new PgCountBuilder({ source, filters, session: this.session });
+  }
+
   /**
    * Incorporates a previously defined CTE (using `$with`) into the main query.
    *
@@ -622,7 +631,11 @@ export const withReplicas = <
   HKT extends PgQueryResultHKT,
   TFullSchema extends Record,
   TSchema extends TablesRelationalConfig,
-  Q extends PgDatabase,
+  Q extends PgDatabase<
+    HKT,
+    TFullSchema,
+    TSchema extends Record ? ExtractTablesWithRelations : TSchema
+  >,
 >(
   primary: Q,
   replicas: [Q, ...Q[]],
diff --git a/drizzle-orm/src/pg-core/query-builders/count.ts b/drizzle-orm/src/pg-core/query-builders/count.ts
new file mode 100644
index 000000000..c93cbb18d
--- /dev/null
+++ b/drizzle-orm/src/pg-core/query-builders/count.ts
@@ -0,0 +1,79 @@
+import { entityKind, sql } from '~/index.ts';
+import type { SQLWrapper } from '~/sql/sql.ts';
+import { SQL } from '~/sql/sql.ts';
+import type { PgSession } from '../session.ts';
+import type { PgTable } from '../table.ts';
+
+export class PgCountBuilder<
+  TSession extends PgSession,
+> extends SQL implements Promise, SQLWrapper {
+  private sql: SQL;
+
+  static readonly [entityKind] = 'PgCountBuilder';
+  [Symbol.toStringTag] = 'PgCountBuilder';
+
+  private session: TSession;
+
+  private static buildEmbeddedCount(
+    source: PgTable | SQL | SQLWrapper,
+    filters?: SQL,
+  ): SQL {
+    return sql`(select count(*) from ${source}${sql.raw(' where ').if(filters)}${filters})`;
+  }
+
+  private static buildCount(
+    source: PgTable | SQL | SQLWrapper,
+    filters?: SQL,
+  ): SQL {
+    return sql`select count(*) as count from ${source}${sql.raw(' where ').if(filters)}${filters};`;
+  }
+
+  constructor(
+    readonly params: {
+      source: PgTable | SQL | SQLWrapper;
+      filters?: SQL;
+      session: TSession;
+    },
+  ) {
+    super(PgCountBuilder.buildEmbeddedCount(params.source, params.filters).queryChunks);
+
+    this.mapWith(Number);
+
+    this.session = params.session;
+
+    this.sql = PgCountBuilder.buildCount(
+      params.source,
+      params.filters,
+    );
+  }
+
+  then(
+    onfulfilled?: ((value: number) => TResult1 | PromiseLike) | null | undefined,
+    onrejected?: ((reason: any) => TResult2 | PromiseLike) | null | undefined,
+  ): Promise {
+    return Promise.resolve(this.session.count(this.sql))
+      .then(
+        onfulfilled,
+        onrejected,
+      );
+  }
+
+  catch(
+    onRejected?: ((reason: any) => never | PromiseLike) | null | undefined,
+  ): Promise {
+    return this.then(undefined, onRejected);
+  }
+
+  finally(onFinally?: (() => void) | null | undefined): Promise {
+    return this.then(
+      (value) => {
+        onFinally?.();
+        return value;
+      },
+      (reason) => {
+        onFinally?.();
+        throw reason;
+      },
+    );
+  }
+}
diff --git a/drizzle-orm/src/pg-core/session.ts b/drizzle-orm/src/pg-core/session.ts
index 434ebc086..ea820f2d8 100644
--- a/drizzle-orm/src/pg-core/session.ts
+++ b/drizzle-orm/src/pg-core/session.ts
@@ -86,6 +86,14 @@
     ).all();
   }
 
+  async count(sql: SQL): Promise {
+    const res = await this.execute<[{ count: string }]>(sql);
+
+    return Number(
+      res[0]['count'],
+    );
+  }
+
   abstract transaction(
     transaction: (tx: PgTransaction) => Promise,
     config?: PgTransactionConfig,
diff --git a/drizzle-orm/src/pg-proxy/driver.ts b/drizzle-orm/src/pg-proxy/driver.ts
index cdffa15c1..d82e86962 100644
--- a/drizzle-orm/src/pg-proxy/driver.ts
+++ b/drizzle-orm/src/pg-proxy/driver.ts
@@ -1,3 +1,4 @@
+import { entityKind } from '~/entity.ts';
 import { DefaultLogger } from '~/logger.ts';
 import { PgDatabase } from '~/pg-core/db.ts';
 import { PgDialect } from '~/pg-core/dialect.ts';
@@ -10,9 +11,11 @@ import {
 import type { DrizzleConfig } from '~/utils.ts';
 import { type PgRemoteQueryResultHKT, PgRemoteSession } from './session.ts';
 
-export type PgRemoteDatabase<
+export class PgRemoteDatabase<
   TSchema extends Record = Record,
-> = PgDatabase;
+> extends PgDatabase {
+  static readonly [entityKind]: string = 'PgRemoteDatabase';
+}
 
 export type RemoteCallback = (
   sql: string,
@@ -48,5 +51,5 @@ export function drizzle = Record;
+  return new PgRemoteDatabase(dialect, session, schema as any) as PgRemoteDatabase;
 }
diff --git a/drizzle-orm/src/pg-proxy/session.ts b/drizzle-orm/src/pg-proxy/session.ts
index eb6a1b1a3..1a30c0a3c 100644
--- a/drizzle-orm/src/pg-proxy/session.ts
+++ b/drizzle-orm/src/pg-proxy/session.ts
@@ -130,7 +130,8 @@ export class PreparedQuery extends PreparedQueryB
     });
   }
 
-  async all() {}
+  async all() {
+  }
 
   /** @internal */
   isResponseInArrayMode(): boolean {
diff --git a/drizzle-orm/src/pglite/driver.ts b/drizzle-orm/src/pglite/driver.ts
index 7de2ce110..a801005d8 100644
--- a/drizzle-orm/src/pglite/driver.ts
+++ b/drizzle-orm/src/pglite/driver.ts
@@ -34,9 +34,11 @@ export class PgliteDriver {
   }
 }
 
-export type PgliteDatabase<
+export class PgliteDatabase<
   TSchema extends Record = Record,
-> = PgDatabase;
+> extends PgDatabase {
+  static readonly [entityKind]: string = 'PgliteDatabase';
+}
 
 export function drizzle = Record>(
   client: PgliteClient,
@@ -65,5 +67,5 @@ export function drizzle = Record;
+  return new PgliteDatabase(dialect, session, schema as any) as PgliteDatabase;
 }
diff --git a/drizzle-orm/src/pglite/session.ts b/drizzle-orm/src/pglite/session.ts
index c7a1dbb5d..ebf7701a6 100644
--- a/drizzle-orm/src/pglite/session.ts
+++ b/drizzle-orm/src/pglite/session.ts
@@ -7,7 +7,7 @@ import type { SelectedFieldsOrdered } from '~/pg-core/query-builders/select.types.ts';
 import type { PgQueryResultHKT, PgTransactionConfig, PreparedQueryConfig } from '~/pg-core/session.ts';
 import { PgPreparedQuery, PgSession } from '~/pg-core/session.ts';
 import type { RelationalSchemaConfig, TablesRelationalConfig } from '~/relations.ts';
-import { fillPlaceholders, type Query, sql } from '~/sql/sql.ts';
+import { fillPlaceholders, type Query, type SQL, sql } from '~/sql/sql.ts';
 import { type Assume, mapResultRow } from '~/utils.ts';
 import { types } from '@electric-sql/pglite';
@@ -140,6 +140,13 @@ export class PgliteSession<
       return transaction(tx);
     }) as Promise;
   }
+
+  override async count(sql: SQL): Promise {
+    const res = await this.execute<{ rows: [{ count: string }] }>(sql);
+    return Number(
+      res['rows'][0]['count'],
+    );
+  }
 }
 
 export class PgliteTransaction<
diff --git a/drizzle-orm/src/planetscale-serverless/driver.ts b/drizzle-orm/src/planetscale-serverless/driver.ts
index b1d2d6e6f..fd1327bbc 100644
--- a/drizzle-orm/src/planetscale-serverless/driver.ts
+++ b/drizzle-orm/src/planetscale-serverless/driver.ts
@@ -1,5 +1,6 @@
 import type { Connection } from '@planetscale/database';
 import { Client } from '@planetscale/database';
+import { entityKind } from '~/entity.ts';
 import type { Logger } from '~/logger.ts';
 import { DefaultLogger } from '~/logger.ts';
 import { MySqlDatabase } from '~/mysql-core/db.ts';
@@ -18,9 +19,11 @@ export interface PlanetscaleSDriverOptions {
   logger?: Logger;
 }
 
-export type PlanetScaleDatabase<
+export class PlanetScaleDatabase<
   TSchema extends Record = Record,
-> = MySqlDatabase;
+> extends MySqlDatabase {
+  static readonly [entityKind]: string = 'PlanetScaleDatabase';
+}
 
 export function drizzle = Record>(
   client: Client | Connection,
@@ -82,5 +85,5 @@ Starting from version 0.30.0, you will encounter an error if you attempt to use
   }
   const session = new PlanetscaleSession(client, dialect, undefined, schema, { logger });
-  return new MySqlDatabase(dialect, session, schema, 'planetscale') as PlanetScaleDatabase;
+  return new PlanetScaleDatabase(dialect, session, schema as any, 'planetscale') as PlanetScaleDatabase;
 }
diff --git a/drizzle-orm/src/planetscale-serverless/migrator.ts b/drizzle-orm/src/planetscale-serverless/migrator.ts
index 5a668ae01..8b3713602 100644
--- a/drizzle-orm/src/planetscale-serverless/migrator.ts
+++ b/drizzle-orm/src/planetscale-serverless/migrator.ts
@@ -4,8 +4,15 @@ import type { PlanetScaleDatabase } from './driver.ts';
 
 export async function migrate>(
   db: PlanetScaleDatabase,
-  config: MigrationConfig,
+  config: MigrationConfig | string,
 ) {
   const migrations = readMigrationFiles(config);
-  await db.dialect.migrate(migrations, db.session, config);
+
+  const preparedConfig = typeof config === 'string'
+    ? {
+      migrationsFolder: config,
+    }
+    : config;
+
+  await db.dialect.migrate(migrations, db.session, preparedConfig);
 }
diff --git a/drizzle-orm/src/planetscale-serverless/session.ts b/drizzle-orm/src/planetscale-serverless/session.ts
index f2275b7f2..987529d7c 100644
--- a/drizzle-orm/src/planetscale-serverless/session.ts
+++ b/drizzle-orm/src/planetscale-serverless/session.ts
@@ -164,6 +164,14 @@ export class PlanetscaleSession<
     ) => eQuery.rows as T[]);
   }
 
+  override async count(sql: SQL): Promise {
+    const res = await this.execute<{ rows: [{ count: string }] }>(sql);
+
+    return Number(
+      res['rows'][0]['count'],
+    );
+  }
+
   override transaction(
     transaction: (tx: PlanetScaleTransaction) => Promise,
   ): Promise {
diff --git a/drizzle-orm/src/postgres-js/driver.ts b/drizzle-orm/src/postgres-js/driver.ts
index 7f44344e8..6714cff8d 100644
--- a/drizzle-orm/src/postgres-js/driver.ts
+++ b/drizzle-orm/src/postgres-js/driver.ts
@@ -1,4 +1,5 @@
 import type { Sql } from 'postgres';
+import { entityKind } from '~/entity.ts';
 import { DefaultLogger } from '~/logger.ts';
 import { PgDatabase } from '~/pg-core/db.ts';
 import { PgDialect } from '~/pg-core/dialect.ts';
@@ -12,9 +13,11 @@ import type { DrizzleConfig } from '~/utils.ts';
 import type { PostgresJsQueryResultHKT } from './session.ts';
 import { PostgresJsSession } from './session.ts';
 
-export type PostgresJsDatabase<
+export class PostgresJsDatabase<
   TSchema extends Record = Record,
-> = PgDatabase;
+> extends PgDatabase {
+  static readonly [entityKind]: string = 'PostgresJsDatabase';
+}
 
 export function drizzle = Record>(
   client: Sql,
@@ -52,5 +55,5 @@ export function drizzle = Record;
+  return new PostgresJsDatabase(dialect, session, schema as any) as PostgresJsDatabase;
 }
diff --git a/drizzle-orm/src/singlestore/driver.ts b/drizzle-orm/src/singlestore/driver.ts
index 78837286f..ffc5c2795 100644
--- a/drizzle-orm/src/singlestore/driver.ts
+++ b/drizzle-orm/src/singlestore/driver.ts
@@ -12,6 +12,7 @@ import { SingleStoreDatabase } from '~/singlestore-core/db.ts';
 import { SingleStoreDialect } from '~/singlestore-core/dialect.ts';
 import type { DrizzleConfig } from '~/utils.ts';
 import type {
+  Mode,
   SingleStoreDriverClient,
   SingleStoreDriverPreparedQueryHKT,
   SingleStoreDriverQueryResultHKT,
@@ -45,6 +46,10 @@ export type SingleStoreDriverDatabase<
   TSchema extends Record = Record,
 > = SingleStoreDatabase;
 
+export type SingleStoreDriverDrizzleConfig = Record> =
+  & Omit, 'schema'>
+  & ({ schema: TSchema; mode: Mode } | { schema?: undefined; mode?: Mode });
+
 export function drizzle = Record>(
   client: SingleStoreDriverClient | CallbackConnection | CallbackPool,
   config: DrizzleConfig = {},
diff --git a/drizzle-orm/src/singlestore/session.ts b/drizzle-orm/src/singlestore/session.ts
index 932510d58..0ea143971 100644
--- a/drizzle-orm/src/singlestore/session.ts
+++ b/drizzle-orm/src/singlestore/session.ts
@@ -31,6 +31,9 @@ import { fillPlaceholders, sql } from '~/sql/sql.ts';
 import type { Query, SQL } from '~/sql/sql.ts';
 import { type Assume, mapResultRow } from '~/utils.ts';
 
+// must keep this type here for compatibility with DrizzleConfig
+export type Mode = 'default';
+
 export type SingleStoreDriverClient = Pool | Connection;
 
 export type SingleStoreRawQueryResult = [ResultSetHeader, FieldPacket[]];
diff --git a/drizzle-orm/src/sqlite-core/db.ts b/drizzle-orm/src/sqlite-core/db.ts
index 65f807d08..7ae2736e0 100644
--- a/drizzle-orm/src/sqlite-core/db.ts
+++ b/drizzle-orm/src/sqlite-core/db.ts
@@ -2,7 +2,7 @@ import { entityKind } from '~/entity.ts';
 import type { TypedQueryBuilder } from '~/query-builders/query-builder.ts';
 import type { ExtractTablesWithRelations, RelationalSchemaConfig, TablesRelationalConfig } from '~/relations.ts';
 import { SelectionProxyHandler } from '~/selection-proxy.ts';
-import type { ColumnsSelection, SQLWrapper } from '~/sql/sql.ts';
+import type { ColumnsSelection, SQL, SQLWrapper } from '~/sql/sql.ts';
 import type { SQLiteAsyncDialect, SQLiteSyncDialect } from '~/sqlite-core/dialect.ts';
 import {
   QueryBuilder,
@@ -21,10 +21,12 @@ import type {
 import type { SQLiteTable } from '~/sqlite-core/table.ts';
 import { WithSubquery } from '~/subquery.ts';
 import type { DrizzleTypeError } from '~/utils.ts';
+import { SQLiteCountBuilder } from './query-builders/count.ts';
 import { RelationalQueryBuilder } from './query-builders/query.ts';
 import { SQLiteRaw } from './query-builders/raw.ts';
 import type { SelectedFields } from './query-builders/select.types.ts';
 import type { WithSubqueryWithSelection } from './subquery.ts';
+import type { SQLiteViewBase } from './view-base.ts';
 
 export class BaseSQLiteDatabase<
   TResultKind extends 'sync' | 'async',
@@ -134,6 +136,13 @@ export class BaseSQLiteDatabase<
     };
   }
 
+  $count(
+    source: SQLiteTable | SQLiteViewBase | SQL | SQLWrapper,
+    filters?: SQL,
+  ) {
+    return new SQLiteCountBuilder({ source, filters, session: this.session });
+  }
+
   /**
    * Incorporates a previously defined CTE (using `$with`) into the main query.
    *
diff --git a/drizzle-orm/src/sqlite-core/query-builders/count.ts b/drizzle-orm/src/sqlite-core/query-builders/count.ts
new file mode 100644
index 000000000..1b19eed07
--- /dev/null
+++ b/drizzle-orm/src/sqlite-core/query-builders/count.ts
@@ -0,0 +1,77 @@
+import { entityKind, sql } from '~/index.ts';
+import type { SQLWrapper } from '~/sql/sql.ts';
+import { SQL } from '~/sql/sql.ts';
+import type { SQLiteSession } from '../session.ts';
+import type { SQLiteTable } from '../table.ts';
+import type { SQLiteView } from '../view.ts';
+
+export class SQLiteCountBuilder<
+  TSession extends SQLiteSession,
+> extends SQL implements Promise, SQLWrapper {
+  private sql: SQL;
+
+  static readonly [entityKind] = 'SQLiteCountBuilderAsync';
+  [Symbol.toStringTag] = 'SQLiteCountBuilderAsync';
+
+  private session: TSession;
+
+  private static buildEmbeddedCount(
+    source: SQLiteTable | SQLiteView | SQL | SQLWrapper,
+    filters?: SQL,
+  ): SQL {
+    return sql`(select count(*) from ${source}${sql.raw(' where ').if(filters)}${filters})`;
+  }
+
+  private static buildCount(
+    source: SQLiteTable | SQLiteView | SQL | SQLWrapper,
+    filters?: SQL,
+  ): SQL {
+    return sql`select count(*) from ${source}${sql.raw(' where ').if(filters)}${filters}`;
+  }
+
+  constructor(
+    readonly params: {
+      source: SQLiteTable | SQLiteView | SQL | SQLWrapper;
+      filters?: SQL;
+      session: TSession;
+    },
+  ) {
+    super(SQLiteCountBuilder.buildEmbeddedCount(params.source, params.filters).queryChunks);
+
+    this.session = params.session;
+
+    this.sql = SQLiteCountBuilder.buildCount(
+      params.source,
+      params.filters,
+    );
+  }
+
+  then(
+    onfulfilled?: ((value: number) => TResult1 | PromiseLike) | null | undefined,
+    onrejected?: ((reason: any) => TResult2 | PromiseLike) | null | undefined,
+  ): Promise {
+    return Promise.resolve(this.session.count(this.sql)).then(
+      onfulfilled,
+      onrejected,
+    );
+  }
+
+  catch(
+    onRejected?: ((reason: any) => never | PromiseLike) | null | undefined,
+  ): Promise {
+    return this.then(undefined, onRejected);
+  }
+
+  finally(onFinally?: (() => void) | null | undefined): Promise {
+    return this.then(
+      (value) => {
+        onFinally?.();
+        return value;
+      },
+      (reason) => {
+        onFinally?.();
+        throw reason;
+      },
+    );
+  }
+}
diff --git a/drizzle-orm/src/sqlite-core/session.ts b/drizzle-orm/src/sqlite-core/session.ts
index 4ac987b4a..d291b6fdf 100644
--- a/drizzle-orm/src/sqlite-core/session.ts
+++ b/drizzle-orm/src/sqlite-core/session.ts
@@ -187,6 +187,12 @@ export abstract class SQLiteSession<
     >;
   }
 
+  async count(sql: SQL) {
+    const result = await this.values(sql) as [[number]];
+
+    return result[0][0];
+  }
+
   /** @internal */
   extractRawValuesValueFromBatchResult(_result: unknown): unknown {
     throw new Error('Not implemented');
diff --git a/drizzle-orm/src/tidb-serverless/driver.ts b/drizzle-orm/src/tidb-serverless/driver.ts
index bdd5324db..b762bd889 100644
--- a/drizzle-orm/src/tidb-serverless/driver.ts
+++ b/drizzle-orm/src/tidb-serverless/driver.ts
@@ -1,4 +1,5 @@
 import type { Connection } from '@tidbcloud/serverless';
+import { entityKind } from '~/entity.ts';
 import type { Logger } from '~/logger.ts';
 import { DefaultLogger } from '~/logger.ts';
 import { MySqlDatabase } from '~/mysql-core/db.ts';
@@ -17,9 +18,11 @@ export interface TiDBServerlessSDriverOptions {
   logger?: Logger;
 }
 
-export type TiDBServerlessDatabase<
+export class TiDBServerlessDatabase<
   TSchema extends Record = Record,
-> = MySqlDatabase;
+> extends MySqlDatabase {
+  static readonly [entityKind]: string = 'TiDBServerlessDatabase';
+}
 
 export function drizzle = Record>(
   client: Connection,
@@ -47,5 +50,5 @@ export function drizzle = Record;
+  return new TiDBServerlessDatabase(dialect, session, schema as any, 'default') as TiDBServerlessDatabase;
 }
diff --git a/drizzle-orm/src/tidb-serverless/session.ts b/drizzle-orm/src/tidb-serverless/session.ts
index 64a8d61d7..b01b9f948 100644
--- a/drizzle-orm/src/tidb-serverless/session.ts
+++ b/drizzle-orm/src/tidb-serverless/session.ts
@@ -139,6 +139,14 @@ export class TiDBServerlessSession<
     return this.client.execute(querySql.sql, querySql.params) as Promise;
   }
 
+  override async count(sql: SQL): Promise {
+    const res = await this.execute<{ rows: [{ count: string }] }>(sql);
+
+    return Number(
+      res['rows'][0]['count'],
+    );
+  }
+
   override async transaction(
     transaction: (tx: TiDBServerlessTransaction) => Promise,
   ): Promise {
diff --git a/drizzle-orm/src/vercel-postgres/driver.ts b/drizzle-orm/src/vercel-postgres/driver.ts
index 07e73c732..bc990d0b3 100644
--- a/drizzle-orm/src/vercel-postgres/driver.ts
+++ b/drizzle-orm/src/vercel-postgres/driver.ts
@@ -42,9 +42,11 @@ export class VercelPgDriver {
   }
 }
 
-export type VercelPgDatabase<
+export class VercelPgDatabase<
   TSchema extends Record = Record,
-> = PgDatabase;
+> extends PgDatabase {
+  static readonly [entityKind]: string = 'VercelPgDatabase';
+}
 
 export function drizzle = Record>(
   client: VercelPgClient,
@@ -73,5 +75,5 @@ export function drizzle = Record;
+  return new VercelPgDatabase(dialect, session, schema as any) as VercelPgDatabase;
 }
diff --git a/drizzle-orm/src/version.ts b/drizzle-orm/src/version.ts
index 0c11937c8..d670a0575 100644
--- a/drizzle-orm/src/version.ts
+++ b/drizzle-orm/src/version.ts
@@ -1,4 +1,4 @@
 // @ts-ignore - imported using Rollup json plugin
 export { version as npmVersion } from '../package.json';
 // In version 7, we changed the PostgreSQL indexes API
-export const compatibilityVersion = 7;
+export const compatibilityVersion = 8;
diff --git a/drizzle-orm/type-tests/mysql/count.ts b/drizzle-orm/type-tests/mysql/count.ts
new file mode 100644
index 000000000..d9b9ba9ff
--- /dev/null
+++ b/drizzle-orm/type-tests/mysql/count.ts
@@ -0,0 +1,61 @@
+import { Expect } from 'type-tests/utils.ts';
+import { and, gt, ne } from '~/expressions.ts';
+import { int, mysqlTable, serial, text } from '~/mysql-core/index.ts';
+import type { Equal } from '~/utils.ts';
+import { db } from './db.ts';
+
+const names = mysqlTable('names', {
+  id: serial('id').primaryKey(),
+  name: text('name'),
+  authorId: int('author_id'),
+});
+
+const separate = await db.$count(names);
+
+const separateFilters = await db.$count(names, and(gt(names.id, 1), ne(names.name, 'forbidden')));
+
+const embedded = await db
+  .select({
+    id: names.id,
+    name: names.name,
+    authorId: names.authorId,
+    count1: db.$count(names).as('count1'),
+  })
+  .from(names);
+
+const embeddedFilters = await db
+  .select({
+    id: names.id,
+    name: names.name,
+    authorId: names.authorId,
+    count1: db.$count(names, and(gt(names.id, 1), ne(names.name, 'forbidden'))).as('count1'),
+  })
+  .from(names);
+
+Expect>;
+
+Expect>;
+
+Expect<
+  Equal<
+    {
+      id: number;
+      name: string | null;
+      authorId: number | null;
+      count1: number;
+    }[],
+    typeof embedded
+  >
+>;
+
+Expect<
+  Equal<
+    {
+      id: number;
+      name: string | null;
+      authorId: number | null;
+      count1: number;
+    }[],
+    typeof embeddedFilters
+  >
+>;
diff --git a/drizzle-orm/type-tests/pg/count.ts b/drizzle-orm/type-tests/pg/count.ts
new file mode 100644
index 000000000..9ed5eeaf9
--- /dev/null
+++ b/drizzle-orm/type-tests/pg/count.ts
@@ -0,0 +1,61 @@
+import { Expect } from 'type-tests/utils.ts';
+import { and, gt, ne } from '~/expressions.ts';
+import { integer, pgTable, serial, text } from '~/pg-core/index.ts';
+import type { Equal } from '~/utils.ts';
+import { db } from './db.ts';
+
+const names = pgTable('names', {
+  id: serial('id').primaryKey(),
+  name: text('name'),
+  authorId: integer('author_id'),
+});
+
+const separate = await db.$count(names);
+
+const separateFilters = await db.$count(names, and(gt(names.id, 1), ne(names.name, 'forbidden')));
+
+const embedded = await db
+  .select({
+    id: names.id,
+    name: names.name,
+    authorId: names.authorId,
+    count1: db.$count(names).as('count1'),
+  })
+  .from(names);
+
+const embeddedFilters = await db
+  .select({
+    id: names.id,
+    name: names.name,
+    authorId: names.authorId,
+    count1:
db.$count(names, and(gt(names.id, 1), ne(names.name, 'forbidden'))).as('count1'), + }) + .from(names); + +Expect>; + +Expect>; + +Expect< + Equal< + { + id: number; + name: string | null; + authorId: number | null; + count1: number; + }[], + typeof embedded + > +>; + +Expect< + Equal< + { + id: number; + name: string | null; + authorId: number | null; + count1: number; + }[], + typeof embeddedFilters + > +>; diff --git a/drizzle-orm/type-tests/sqlite/count.ts b/drizzle-orm/type-tests/sqlite/count.ts new file mode 100644 index 000000000..04350f000 --- /dev/null +++ b/drizzle-orm/type-tests/sqlite/count.ts @@ -0,0 +1,61 @@ +import { Expect } from 'type-tests/utils.ts'; +import { and, gt, ne } from '~/expressions.ts'; +import { integer, sqliteTable, text } from '~/sqlite-core/index.ts'; +import type { Equal } from '~/utils.ts'; +import { db } from './db.ts'; + +const names = sqliteTable('names', { + id: integer('id').primaryKey(), + name: text('name'), + authorId: integer('author_id'), +}); + +const separate = await db.$count(names); + +const separateFilters = await db.$count(names, and(gt(names.id, 1), ne(names.name, 'forbidden'))); + +const embedded = await db + .select({ + id: names.id, + name: names.name, + authorId: names.authorId, + count1: db.$count(names).as('count1'), + }) + .from(names); + +const embeddedFilters = await db + .select({ + id: names.id, + name: names.name, + authorId: names.authorId, + count1: db.$count(names, and(gt(names.id, 1), ne(names.name, 'forbidden'))).as('count1'), + }) + .from(names); + +Expect>; + +Expect>; + +Expect< + Equal< + { + id: number; + name: string | null; + authorId: number | null; + count1: number; + }[], + typeof embedded + > +>; + +Expect< + Equal< + { + id: number; + name: string | null; + authorId: number | null; + count1: number; + }[], + typeof embeddedFilters + > +>; diff --git a/integration-tests/package.json b/integration-tests/package.json index a4fcab0b2..78f36fe30 100644 --- a/integration-tests/package.json 
+++ b/integration-tests/package.json @@ -15,6 +15,7 @@ "license": "Apache-2.0", "private": true, "devDependencies": { + "@libsql/client": "^0.10.0", "@neondatabase/serverless": "0.9.0", "@originjs/vite-plugin-commonjs": "^1.0.3", "@paralleldrive/cuid2": "^2.2.2", @@ -41,7 +42,6 @@ "@aws-sdk/client-rds-data": "^3.549.0", "@aws-sdk/credential-providers": "^3.549.0", "@electric-sql/pglite": "^0.1.1", - "@libsql/client": "^0.5.6", "@miniflare/d1": "^2.14.2", "@miniflare/shared": "^2.14.2", "@planetscale/database": "^1.16.0", diff --git a/integration-tests/tests/mysql/mysql-common.ts b/integration-tests/tests/mysql/mysql-common.ts index 58f7a1e2c..8a2fb768b 100644 --- a/integration-tests/tests/mysql/mysql-common.ts +++ b/integration-tests/tests/mysql/mysql-common.ts @@ -3577,29 +3577,237 @@ export function tests(driver?: string) { await db.execute(sql`drop view ${newYorkers1}`); }); - }); - test('limit 0', async (ctx) => { - const { db } = ctx.mysql; + test('$count separate', async (ctx) => { + const { db } = ctx.mysql; - await db.insert(usersTable).values({ name: 'John' }); - const users = await db - .select() - .from(usersTable) - .limit(0); + const countTestTable = mysqlTable('count_test', { + id: int('id').notNull(), + name: text('name').notNull(), + }); - expect(users).toEqual([]); - }); + await db.execute(sql`drop table if exists ${countTestTable}`); + await db.execute(sql`create table ${countTestTable} (id int, name text)`); + + await db.insert(countTestTable).values([ + { id: 1, name: 'First' }, + { id: 2, name: 'Second' }, + { id: 3, name: 'Third' }, + { id: 4, name: 'Fourth' }, + ]); + + const count = await db.$count(countTestTable); + + await db.execute(sql`drop table ${countTestTable}`); + + expect(count).toStrictEqual(4); + }); + + test('$count embedded', async (ctx) => { + const { db } = ctx.mysql; + + const countTestTable = mysqlTable('count_test', { + id: int('id').notNull(), + name: text('name').notNull(), + }); + + await db.execute(sql`drop table if 
exists ${countTestTable}`); + await db.execute(sql`create table ${countTestTable} (id int, name text)`); + + await db.insert(countTestTable).values([ + { id: 1, name: 'First' }, + { id: 2, name: 'Second' }, + { id: 3, name: 'Third' }, + { id: 4, name: 'Fourth' }, + ]); + + const count = await db.select({ + count: db.$count(countTestTable), + }).from(countTestTable); + + await db.execute(sql`drop table ${countTestTable}`); + + expect(count).toStrictEqual([ + { count: 4 }, + { count: 4 }, + { count: 4 }, + { count: 4 }, + ]); + }); + + test('$count separate reuse', async (ctx) => { + const { db } = ctx.mysql; + + const countTestTable = mysqlTable('count_test', { + id: int('id').notNull(), + name: text('name').notNull(), + }); + + await db.execute(sql`drop table if exists ${countTestTable}`); + await db.execute(sql`create table ${countTestTable} (id int, name text)`); + + await db.insert(countTestTable).values([ + { id: 1, name: 'First' }, + { id: 2, name: 'Second' }, + { id: 3, name: 'Third' }, + { id: 4, name: 'Fourth' }, + ]); + + const count = db.$count(countTestTable); + + const count1 = await count; + + await db.insert(countTestTable).values({ id: 5, name: 'fifth' }); + + const count2 = await count; + + await db.insert(countTestTable).values({ id: 6, name: 'sixth' }); + + const count3 = await count; + + await db.execute(sql`drop table ${countTestTable}`); + + expect(count1).toStrictEqual(4); + expect(count2).toStrictEqual(5); + expect(count3).toStrictEqual(6); + }); + + test('$count embedded reuse', async (ctx) => { + const { db } = ctx.mysql; + + const countTestTable = mysqlTable('count_test', { + id: int('id').notNull(), + name: text('name').notNull(), + }); + + await db.execute(sql`drop table if exists ${countTestTable}`); + await db.execute(sql`create table ${countTestTable} (id int, name text)`); + + await db.insert(countTestTable).values([ + { id: 1, name: 'First' }, + { id: 2, name: 'Second' }, + { id: 3, name: 'Third' }, + { id: 4, name: 'Fourth' }, + 
]); - test('limit -1', async (ctx) => { - const { db } = ctx.mysql; + const count = db.select({ + count: db.$count(countTestTable), + }).from(countTestTable); - await db.insert(usersTable).values({ name: 'John' }); - const users = await db - .select() - .from(usersTable) - .limit(-1); + const count1 = await count; - expect(users.length).toBeGreaterThan(0); + await db.insert(countTestTable).values({ id: 5, name: 'fifth' }); + + const count2 = await count; + + await db.insert(countTestTable).values({ id: 6, name: 'sixth' }); + + const count3 = await count; + + await db.execute(sql`drop table ${countTestTable}`); + + expect(count1).toStrictEqual([ + { count: 4 }, + { count: 4 }, + { count: 4 }, + { count: 4 }, + ]); + expect(count2).toStrictEqual([ + { count: 5 }, + { count: 5 }, + { count: 5 }, + { count: 5 }, + { count: 5 }, + ]); + expect(count3).toStrictEqual([ + { count: 6 }, + { count: 6 }, + { count: 6 }, + { count: 6 }, + { count: 6 }, + { count: 6 }, + ]); + }); + + test('$count separate with filters', async (ctx) => { + const { db } = ctx.mysql; + + const countTestTable = mysqlTable('count_test', { + id: int('id').notNull(), + name: text('name').notNull(), + }); + + await db.execute(sql`drop table if exists ${countTestTable}`); + await db.execute(sql`create table ${countTestTable} (id int, name text)`); + + await db.insert(countTestTable).values([ + { id: 1, name: 'First' }, + { id: 2, name: 'Second' }, + { id: 3, name: 'Third' }, + { id: 4, name: 'Fourth' }, + ]); + + const count = await db.$count(countTestTable, gt(countTestTable.id, 1)); + + await db.execute(sql`drop table ${countTestTable}`); + + expect(count).toStrictEqual(3); + }); + + test('$count embedded with filters', async (ctx) => { + const { db } = ctx.mysql; + + const countTestTable = mysqlTable('count_test', { + id: int('id').notNull(), + name: text('name').notNull(), + }); + + await db.execute(sql`drop table if exists ${countTestTable}`); + await db.execute(sql`create table ${countTestTable} 
(id int, name text)`); + + await db.insert(countTestTable).values([ + { id: 1, name: 'First' }, + { id: 2, name: 'Second' }, + { id: 3, name: 'Third' }, + { id: 4, name: 'Fourth' }, + ]); + + const count = await db.select({ + count: db.$count(countTestTable, gt(countTestTable.id, 1)), + }).from(countTestTable); + + await db.execute(sql`drop table ${countTestTable}`); + + expect(count).toStrictEqual([ + { count: 3 }, + { count: 3 }, + { count: 3 }, + { count: 3 }, + ]); + }); + + test('limit 0', async (ctx) => { + const { db } = ctx.mysql; + + await db.insert(usersTable).values({ name: 'John' }); + const users = await db + .select() + .from(usersTable) + .limit(0); + + expect(users).toEqual([]); + }); + + test('limit -1', async (ctx) => { + const { db } = ctx.mysql; + + await db.insert(usersTable).values({ name: 'John' }); + const users = await db + .select() + .from(usersTable) + .limit(-1); + + expect(users.length).toBeGreaterThan(0); + }); }); } diff --git a/integration-tests/tests/pg/awsdatapi.test.ts b/integration-tests/tests/pg/awsdatapi.test.ts index 8ee39cf12..3bb884c0c 100644 --- a/integration-tests/tests/pg/awsdatapi.test.ts +++ b/integration-tests/tests/pg/awsdatapi.test.ts @@ -799,19 +799,18 @@ test('migrator : default migration strategy', async () => { }); test('migrator : migrate with custom schema', async () => { - const customSchema = randomString(); await db.execute(sql`drop table if exists all_columns`); await db.execute(sql`drop table if exists users12`); await db.execute(sql`drop table if exists "drizzle"."__drizzle_migrations"`); await migrate(db, { migrationsFolder: './drizzle2/pg', - migrationsSchema: customSchema, + migrationsSchema: 'custom_migrations', }); // test if the custom migrations table was created const { rows } = await db.execute( - sql`select * from ${sql.identifier(customSchema)}."__drizzle_migrations";`, + sql`select * from custom_migrations."__drizzle_migrations";`, ); expect(rows).toBeTruthy(); 
expect(rows!.length).toBeGreaterThan(0); @@ -824,7 +823,7 @@ test('migrator : migrate with custom schema', async () => { await db.execute(sql`drop table all_columns`); await db.execute(sql`drop table users12`); await db.execute( - sql`drop table ${sql.identifier(customSchema)}."__drizzle_migrations"`, + sql`drop table custom_migrations."__drizzle_migrations"`, ); }); @@ -858,7 +857,6 @@ test('migrator : migrate with custom table', async () => { test('migrator : migrate with custom table and custom schema', async () => { const customTable = randomString(); - const customSchema = randomString(); await db.execute(sql`drop table if exists all_columns`); await db.execute(sql`drop table if exists users12`); await db.execute(sql`drop table if exists "drizzle"."__drizzle_migrations"`); @@ -866,12 +864,12 @@ test('migrator : migrate with custom table and custom schema', async () => { await migrate(db, { migrationsFolder: './drizzle2/pg', migrationsTable: customTable, - migrationsSchema: customSchema, + migrationsSchema: 'custom_migrations', }); // test if the custom migrations table was created const { rows } = await db.execute( - sql`select * from ${sql.identifier(customSchema)}.${ + sql`select * from custom_migrations.${ sql.identifier( customTable, ) @@ -888,7 +886,7 @@ test('migrator : migrate with custom table and custom schema', async () => { await db.execute(sql`drop table all_columns`); await db.execute(sql`drop table users12`); await db.execute( - sql`drop table ${sql.identifier(customSchema)}.${ + sql`drop table custom_migrations.${ sql.identifier( customTable, ) diff --git a/integration-tests/tests/pg/neon-http.test.ts b/integration-tests/tests/pg/neon-http.test.ts index 1476e9628..319c84f40 100644 --- a/integration-tests/tests/pg/neon-http.test.ts +++ b/integration-tests/tests/pg/neon-http.test.ts @@ -68,15 +68,14 @@ test('migrator : default migration strategy', async () => { }); test('migrator : migrate with custom schema', async () => { - const customSchema = 
randomString(); await db.execute(sql`drop table if exists all_columns`); await db.execute(sql`drop table if exists users12`); await db.execute(sql`drop table if exists "drizzle"."__drizzle_migrations"`); - await migrate(db, { migrationsFolder: './drizzle2/pg', migrationsSchema: customSchema }); + await migrate(db, { migrationsFolder: './drizzle2/pg', migrationsSchema: 'custom_migrations' }); // test if the custom migrations table was created - const { rowCount } = await db.execute(sql`select * from ${sql.identifier(customSchema)}."__drizzle_migrations";`); + const { rowCount } = await db.execute(sql`select * from custom_migrations."__drizzle_migrations";`); expect(rowCount && rowCount > 0).toBeTruthy(); // test if the migrated table are working as expected @@ -86,7 +85,7 @@ test('migrator : migrate with custom schema', async () => { await db.execute(sql`drop table all_columns`); await db.execute(sql`drop table users12`); - await db.execute(sql`drop table ${sql.identifier(customSchema)}."__drizzle_migrations"`); + await db.execute(sql`drop table custom_migrations."__drizzle_migrations"`); }); test('migrator : migrate with custom table', async () => { @@ -113,7 +112,6 @@ test('migrator : migrate with custom table', async () => { test('migrator : migrate with custom table and custom schema', async () => { const customTable = randomString(); - const customSchema = randomString(); await db.execute(sql`drop table if exists all_columns`); await db.execute(sql`drop table if exists users12`); await db.execute(sql`drop table if exists "drizzle"."__drizzle_migrations"`); @@ -121,12 +119,12 @@ test('migrator : migrate with custom table and custom schema', async () => { await migrate(db, { migrationsFolder: './drizzle2/pg', migrationsTable: customTable, - migrationsSchema: customSchema, + migrationsSchema: 'custom_migrations', }); // test if the custom migrations table was created const { rowCount } = await db.execute( - sql`select * from 
${sql.identifier(customSchema)}.${sql.identifier(customTable)};`, + sql`select * from custom_migrations.${sql.identifier(customTable)};`, ); expect(rowCount && rowCount > 0).toBeTruthy(); @@ -137,7 +135,7 @@ test('migrator : migrate with custom table and custom schema', async () => { await db.execute(sql`drop table all_columns`); await db.execute(sql`drop table users12`); - await db.execute(sql`drop table ${sql.identifier(customSchema)}.${sql.identifier(customTable)}`); + await db.execute(sql`drop table custom_migrations.${sql.identifier(customTable)}`); }); test('all date and time columns without timezone first case mode string', async () => { diff --git a/integration-tests/tests/pg/pg-common.ts b/integration-tests/tests/pg/pg-common.ts index c48a533f9..8550f5ae4 100644 --- a/integration-tests/tests/pg/pg-common.ts +++ b/integration-tests/tests/pg/pg-common.ts @@ -74,7 +74,7 @@ import { } from 'drizzle-orm/pg-core'; import getPort from 'get-port'; import { v4 as uuidV4 } from 'uuid'; -import { afterAll, beforeEach, describe, expect, test } from 'vitest'; +import { afterAll, afterEach, beforeEach, describe, expect, test } from 'vitest'; import { Expect } from '~/utils'; import type { schema } from './neon-http-batch.test'; // eslint-disable-next-line @typescript-eslint/no-import-type-side-effects @@ -246,6 +246,7 @@ export function tests() { await db.execute(sql`drop schema if exists public cascade`); await db.execute(sql`drop schema if exists ${mySchema} cascade`); await db.execute(sql`create schema public`); + await db.execute(sql`create schema if not exists custom_migrations`); await db.execute(sql`create schema ${mySchema}`); // public users await db.execute( @@ -377,6 +378,11 @@ export function tests() { ); }); + afterEach(async (ctx) => { + const { db } = ctx.pg; + await db.execute(sql`drop schema if exists custom_migrations cascade`); + }); + async function setupSetOperationTest(db: PgDatabase) { await db.execute(sql`drop table if exists users2`); await 
db.execute(sql`drop table if exists cities`); @@ -4660,5 +4666,213 @@ export function tests() { jsonbNumberField: testNumber, }]); }); + + test('$count separate', async (ctx) => { + const { db } = ctx.pg; + + const countTestTable = pgTable('count_test', { + id: integer('id').notNull(), + name: text('name').notNull(), + }); + + await db.execute(sql`drop table if exists ${countTestTable}`); + await db.execute(sql`create table ${countTestTable} (id int, name text)`); + + await db.insert(countTestTable).values([ + { id: 1, name: 'First' }, + { id: 2, name: 'Second' }, + { id: 3, name: 'Third' }, + { id: 4, name: 'Fourth' }, + ]); + + const count = await db.$count(countTestTable); + + await db.execute(sql`drop table ${countTestTable}`); + + expect(count).toStrictEqual(4); + }); + + test('$count embedded', async (ctx) => { + const { db } = ctx.pg; + + const countTestTable = pgTable('count_test', { + id: integer('id').notNull(), + name: text('name').notNull(), + }); + + await db.execute(sql`drop table if exists ${countTestTable}`); + await db.execute(sql`create table ${countTestTable} (id int, name text)`); + + await db.insert(countTestTable).values([ + { id: 1, name: 'First' }, + { id: 2, name: 'Second' }, + { id: 3, name: 'Third' }, + { id: 4, name: 'Fourth' }, + ]); + + const count = await db.select({ + count: db.$count(countTestTable), + }).from(countTestTable); + + await db.execute(sql`drop table ${countTestTable}`); + + expect(count).toStrictEqual([ + { count: 4 }, + { count: 4 }, + { count: 4 }, + { count: 4 }, + ]); + }); + + test('$count separate reuse', async (ctx) => { + const { db } = ctx.pg; + + const countTestTable = pgTable('count_test', { + id: integer('id').notNull(), + name: text('name').notNull(), + }); + + await db.execute(sql`drop table if exists ${countTestTable}`); + await db.execute(sql`create table ${countTestTable} (id int, name text)`); + + await db.insert(countTestTable).values([ + { id: 1, name: 'First' }, + { id: 2, name: 'Second' }, + { id: 
3, name: 'Third' }, + { id: 4, name: 'Fourth' }, + ]); + + const count = db.$count(countTestTable); + + const count1 = await count; + + await db.insert(countTestTable).values({ id: 5, name: 'fifth' }); + + const count2 = await count; + + await db.insert(countTestTable).values({ id: 6, name: 'sixth' }); + + const count3 = await count; + + await db.execute(sql`drop table ${countTestTable}`); + + expect(count1).toStrictEqual(4); + expect(count2).toStrictEqual(5); + expect(count3).toStrictEqual(6); + }); + + test('$count embedded reuse', async (ctx) => { + const { db } = ctx.pg; + + const countTestTable = pgTable('count_test', { + id: integer('id').notNull(), + name: text('name').notNull(), + }); + + await db.execute(sql`drop table if exists ${countTestTable}`); + await db.execute(sql`create table ${countTestTable} (id int, name text)`); + + await db.insert(countTestTable).values([ + { id: 1, name: 'First' }, + { id: 2, name: 'Second' }, + { id: 3, name: 'Third' }, + { id: 4, name: 'Fourth' }, + ]); + + const count = db.select({ + count: db.$count(countTestTable), + }).from(countTestTable); + + const count1 = await count; + + await db.insert(countTestTable).values({ id: 5, name: 'fifth' }); + + const count2 = await count; + + await db.insert(countTestTable).values({ id: 6, name: 'sixth' }); + + const count3 = await count; + + await db.execute(sql`drop table ${countTestTable}`); + + expect(count1).toStrictEqual([ + { count: 4 }, + { count: 4 }, + { count: 4 }, + { count: 4 }, + ]); + expect(count2).toStrictEqual([ + { count: 5 }, + { count: 5 }, + { count: 5 }, + { count: 5 }, + { count: 5 }, + ]); + expect(count3).toStrictEqual([ + { count: 6 }, + { count: 6 }, + { count: 6 }, + { count: 6 }, + { count: 6 }, + { count: 6 }, + ]); + }); + + test('$count separate with filters', async (ctx) => { + const { db } = ctx.pg; + + const countTestTable = pgTable('count_test', { + id: integer('id').notNull(), + name: text('name').notNull(), + }); + + await db.execute(sql`drop 
table if exists ${countTestTable}`); + await db.execute(sql`create table ${countTestTable} (id int, name text)`); + + await db.insert(countTestTable).values([ + { id: 1, name: 'First' }, + { id: 2, name: 'Second' }, + { id: 3, name: 'Third' }, + { id: 4, name: 'Fourth' }, + ]); + + const count = await db.$count(countTestTable, gt(countTestTable.id, 1)); + + await db.execute(sql`drop table ${countTestTable}`); + + expect(count).toStrictEqual(3); + }); + + test('$count embedded with filters', async (ctx) => { + const { db } = ctx.pg; + + const countTestTable = pgTable('count_test', { + id: integer('id').notNull(), + name: text('name').notNull(), + }); + + await db.execute(sql`drop table if exists ${countTestTable}`); + await db.execute(sql`create table ${countTestTable} (id int, name text)`); + + await db.insert(countTestTable).values([ + { id: 1, name: 'First' }, + { id: 2, name: 'Second' }, + { id: 3, name: 'Third' }, + { id: 4, name: 'Fourth' }, + ]); + + const count = await db.select({ + count: db.$count(countTestTable, gt(countTestTable.id, 1)), + }).from(countTestTable); + + await db.execute(sql`drop table ${countTestTable}`); + + expect(count).toStrictEqual([ + { count: 3 }, + { count: 3 }, + { count: 3 }, + { count: 3 }, + ]); + }); }); } diff --git a/integration-tests/tests/pg/vercel-pg.test.ts b/integration-tests/tests/pg/vercel-pg.test.ts index 3f1248d9b..ecf1d22ac 100644 --- a/integration-tests/tests/pg/vercel-pg.test.ts +++ b/integration-tests/tests/pg/vercel-pg.test.ts @@ -77,15 +77,14 @@ test('migrator : default migration strategy', async () => { }); test('migrator : migrate with custom schema', async () => { - const customSchema = randomString(); await db.execute(sql`drop table if exists all_columns`); await db.execute(sql`drop table if exists users12`); await db.execute(sql`drop table if exists "drizzle"."__drizzle_migrations"`); - await migrate(db, { migrationsFolder: './drizzle2/pg', migrationsSchema: customSchema }); + await migrate(db, { 
migrationsFolder: './drizzle2/pg', migrationsSchema: 'custom_migrations' }); // test if the custom migrations table was created - const { rowCount } = await db.execute(sql`select * from ${sql.identifier(customSchema)}."__drizzle_migrations";`); + const { rowCount } = await db.execute(sql`select * from custom_migrations."__drizzle_migrations";`); expect(rowCount && rowCount > 0).toBeTruthy(); // test if the migrated table are working as expected @@ -95,7 +94,7 @@ test('migrator : migrate with custom schema', async () => { await db.execute(sql`drop table all_columns`); await db.execute(sql`drop table users12`); - await db.execute(sql`drop table ${sql.identifier(customSchema)}."__drizzle_migrations"`); + await db.execute(sql`drop table custom_migrations."__drizzle_migrations"`); }); test('migrator : migrate with custom table', async () => { @@ -122,7 +121,6 @@ test('migrator : migrate with custom table', async () => { test('migrator : migrate with custom table and custom schema', async () => { const customTable = randomString(); - const customSchema = randomString(); await db.execute(sql`drop table if exists all_columns`); await db.execute(sql`drop table if exists users12`); await db.execute(sql`drop table if exists "drizzle"."__drizzle_migrations"`); @@ -130,12 +128,12 @@ test('migrator : migrate with custom table and custom schema', async () => { await migrate(db, { migrationsFolder: './drizzle2/pg', migrationsTable: customTable, - migrationsSchema: customSchema, + migrationsSchema: 'custom_migrations', }); // test if the custom migrations table was created const { rowCount } = await db.execute( - sql`select * from ${sql.identifier(customSchema)}.${sql.identifier(customTable)};`, + sql`select * from custom_migrations.${sql.identifier(customTable)};`, ); expect(rowCount && rowCount > 0).toBeTruthy(); @@ -146,7 +144,7 @@ test('migrator : migrate with custom table and custom schema', async () => { await db.execute(sql`drop table all_columns`); await db.execute(sql`drop 
table users12`); - await db.execute(sql`drop table ${sql.identifier(customSchema)}.${sql.identifier(customTable)}`); + await db.execute(sql`drop table custom_migrations.${sql.identifier(customTable)}`); }); test('all date and time columns without timezone first case mode string', async () => { diff --git a/integration-tests/tests/relational/singlestore.schema.ts b/integration-tests/tests/relational/singlestore.schema.ts new file mode 100644 index 000000000..ca3386ba0 --- /dev/null +++ b/integration-tests/tests/relational/singlestore.schema.ts @@ -0,0 +1,106 @@ +import { bigint, boolean, primaryKey, serial, singlestoreTable, text, timestamp } from 'drizzle-orm/singlestore-core'; + +import { relations } from 'drizzle-orm'; + +export const usersTable = singlestoreTable('users', { + id: serial('id').primaryKey(), + name: text('name').notNull(), + verified: boolean('verified').notNull().default(false), + invitedBy: bigint('invited_by', { mode: 'number' }), +}); +export const usersConfig = relations(usersTable, ({ one, many }) => ({ + invitee: one(usersTable, { + fields: [usersTable.invitedBy], + references: [usersTable.id], + }), + usersToGroups: many(usersToGroupsTable), + posts: many(postsTable), + comments: many(commentsTable), +})); + +export const groupsTable = singlestoreTable('groups', { + id: serial('id').primaryKey(), + name: text('name').notNull(), + description: text('description'), +}); +export const groupsConfig = relations(groupsTable, ({ many }) => ({ + usersToGroups: many(usersToGroupsTable), +})); + +export const usersToGroupsTable = singlestoreTable( + 'users_to_groups', + { + id: serial('id').primaryKey(), + userId: bigint('user_id', { mode: 'number' }).notNull(), + groupId: bigint('group_id', { mode: 'number' }).notNull(), + }, + (t) => ({ + pk: primaryKey(t.userId, t.groupId), + }), +); +export const usersToGroupsConfig = relations(usersToGroupsTable, ({ one }) => ({ + group: one(groupsTable, { + fields: [usersToGroupsTable.groupId], + references: 
[groupsTable.id], + }), + user: one(usersTable, { + fields: [usersToGroupsTable.userId], + references: [usersTable.id], + }), +})); + +export const postsTable = singlestoreTable('posts', { + id: serial('id').primaryKey(), + content: text('content').notNull(), + ownerId: bigint('owner_id', { mode: 'number' }), + createdAt: timestamp('created_at') + .notNull() + .defaultNow(), +}); +export const postsConfig = relations(postsTable, ({ one, many }) => ({ + author: one(usersTable, { + fields: [postsTable.ownerId], + references: [usersTable.id], + }), + comments: many(commentsTable), +})); + +export const commentsTable = singlestoreTable('comments', { + id: serial('id').primaryKey(), + content: text('content').notNull(), + creator: bigint('creator', { mode: 'number' }), + postId: bigint('post_id', { mode: 'number' }), + createdAt: timestamp('created_at') + .notNull() + .defaultNow(), +}); +export const commentsConfig = relations(commentsTable, ({ one, many }) => ({ + post: one(postsTable, { + fields: [commentsTable.postId], + references: [postsTable.id], + }), + author: one(usersTable, { + fields: [commentsTable.creator], + references: [usersTable.id], + }), + likes: many(commentLikesTable), +})); + +export const commentLikesTable = singlestoreTable('comment_likes', { + id: serial('id').primaryKey(), + creator: bigint('creator', { mode: 'number' }), + commentId: bigint('comment_id', { mode: 'number' }), + createdAt: timestamp('created_at') + .notNull() + .defaultNow(), +}); +export const commentLikesConfig = relations(commentLikesTable, ({ one }) => ({ + comment: one(commentsTable, { + fields: [commentLikesTable.commentId], + references: [commentsTable.id], + }), + author: one(usersTable, { + fields: [commentLikesTable.creator], + references: [usersTable.id], + }), +})); diff --git a/integration-tests/tests/relational/singlestore.test.ts b/integration-tests/tests/relational/singlestore.test.ts new file mode 100644 index 000000000..50aa2e8f4 --- /dev/null +++ 
b/integration-tests/tests/relational/singlestore.test.ts @@ -0,0 +1,6402 @@ +import retry from 'async-retry'; +import Docker from 'dockerode'; +import 'dotenv/config'; +import { desc, DrizzleError, eq, gt, gte, or, placeholder, sql, TransactionRollbackError } from 'drizzle-orm'; +import { drizzle, type SingleStoreDriverDatabase } from 'drizzle-orm/singlestore'; +import getPort from 'get-port'; +import * as mysql from 'mysql2/promise'; +import { v4 as uuid } from 'uuid'; +import { afterAll, beforeAll, beforeEach, expect, expectTypeOf, test } from 'vitest'; +import * as schema from './singlestore.schema.ts'; + +const { usersTable, postsTable, commentsTable, usersToGroupsTable, groupsTable } = schema; + +const ENABLE_LOGGING = false; + +/* + Test cases: + - querying nested relation without PK with additional fields +*/ + +declare module 'vitest' { + export interface TestContext { + docker: Docker; + singlestoreContainer: Docker.Container; + singlestoreDb: SingleStoreDriverDatabase; + singlestoreClient: mysql.Connection; + } +} + +let globalDocker: Docker; +let singlestoreContainer: Docker.Container; +let db: SingleStoreDriverDatabase; +let client: mysql.Connection; + +async function createDockerDB(): Promise { + const docker = new Docker(); + const port = await getPort({ port: 3306 }); + const image = 'ghcr.io/singlestore-labs/singlestoredb-dev:latest'; + + const pullStream = await docker.pull(image); + await new Promise((resolve, reject) => + docker.modem.followProgress(pullStream, (err) => (err ? 
reject(err) : resolve(err))) + ); + + singlestoreContainer = await docker.createContainer({ + Image: image, + Env: ['ROOT_PASSWORD=singlestore'], + name: `drizzle-integration-tests-${uuid()}`, + HostConfig: { + AutoRemove: true, + PortBindings: { + '3306/tcp': [{ HostPort: `${port}` }], + }, + }, + }); + + await singlestoreContainer.start(); + await new Promise((resolve) => setTimeout(resolve, 4000)); + + return `singlestore://root:singlestore@localhost:${port}/`; +} + +beforeAll(async () => { + const connectionString = process.env['SINGLESTORE_CONNECTION_STRING'] ?? (await createDockerDB()); + client = await retry(async () => { + client = await mysql.createConnection(connectionString); + await client.connect(); + return client; + }, { + retries: 20, + factor: 1, + minTimeout: 250, + maxTimeout: 250, + randomize: false, + onRetry() { + client?.end(); + }, + }); + + await client.query(`CREATE DATABASE IF NOT EXISTS drizzle;`); + await client.changeUser({ database: 'drizzle' }); + db = drizzle(client, { schema, logger: ENABLE_LOGGING }); +}); + +afterAll(async () => { + await client?.end().catch(console.error); + await singlestoreContainer?.stop().catch(console.error); +}); + +beforeEach(async (ctx) => { + ctx.singlestoreDb = db; + ctx.singlestoreClient = client; + ctx.docker = globalDocker; + ctx.singlestoreContainer = singlestoreContainer; + + await ctx.singlestoreDb.execute(sql`drop table if exists \`users\``); + await ctx.singlestoreDb.execute(sql`drop table if exists \`groups\``); + await ctx.singlestoreDb.execute(sql`drop table if exists \`users_to_groups\``); + await ctx.singlestoreDb.execute(sql`drop table if exists \`posts\``); + await ctx.singlestoreDb.execute(sql`drop table if exists \`comments\``); + await ctx.singlestoreDb.execute(sql`drop table if exists \`comment_likes\``); + + await ctx.singlestoreDb.execute( + sql` + CREATE TABLE \`users\` ( + \`id\` serial PRIMARY KEY NOT NULL, + \`name\` text NOT NULL, + \`verified\` boolean DEFAULT false NOT NULL, 
+ \`invited_by\` bigint + ); + `, + ); + await ctx.singlestoreDb.execute( + sql` + CREATE TABLE \`groups\` ( + \`id\` serial PRIMARY KEY NOT NULL, + \`name\` text NOT NULL, + \`description\` text + ); + `, + ); + await ctx.singlestoreDb.execute( + sql` + CREATE TABLE \`users_to_groups\` ( + \`id\` serial PRIMARY KEY NOT NULL, + \`user_id\` bigint, + \`group_id\` bigint + ); + `, + ); + await ctx.singlestoreDb.execute( + sql` + CREATE TABLE \`posts\` ( + \`id\` serial PRIMARY KEY NOT NULL, + \`content\` text NOT NULL, + \`owner_id\` bigint, + \`created_at\` timestamp DEFAULT CURRENT_TIMESTAMP NOT NULL + ); + `, + ); + await ctx.singlestoreDb.execute( + sql` + CREATE TABLE \`comments\` ( + \`id\` serial PRIMARY KEY NOT NULL, + \`content\` text NOT NULL, + \`creator\` bigint, + \`post_id\` bigint, + \`created_at\` timestamp DEFAULT CURRENT_TIMESTAMP NOT NULL + ); + `, + ); + await ctx.singlestoreDb.execute( + sql` + CREATE TABLE \`comment_likes\` ( + \`id\` serial PRIMARY KEY NOT NULL, + \`creator\` bigint, + \`comment_id\` bigint, + \`created_at\` timestamp DEFAULT CURRENT_TIMESTAMP NOT NULL + ); + `, + ); +}); + +/* + [Find Many] One relation users+posts +*/ + +test('[Find Many] Get users with posts', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 3, content: 'Post3' }, + ]); + + const usersWithPosts = await db.query.usersTable.findMany({ + with: { + posts: true, + }, + }); + + expectTypeOf(usersWithPosts).toEqualTypeOf<{ + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + posts: { + id: number; + content: string; + ownerId: number | null; + createdAt: Date; + }[]; + }[]>(); + + usersWithPosts.sort((a, b) => (a.id > b.id) ? 
1 : -1); + + expect(usersWithPosts.length).eq(3); + expect(usersWithPosts[0]?.posts.length).eq(1); + expect(usersWithPosts[1]?.posts.length).eq(1); + expect(usersWithPosts[2]?.posts.length).eq(1); + + expect(usersWithPosts[0]).toEqual({ + id: 1, + name: 'Dan', + verified: false, + invitedBy: null, + posts: [{ id: 1, ownerId: 1, content: 'Post1', createdAt: usersWithPosts[0]?.posts[0]?.createdAt }], + }); + expect(usersWithPosts[1]).toEqual({ + id: 2, + name: 'Andrew', + verified: false, + invitedBy: null, + posts: [{ id: 2, ownerId: 2, content: 'Post2', createdAt: usersWithPosts[1]?.posts[0]?.createdAt }], + }); + expect(usersWithPosts[2]).toEqual({ + id: 3, + name: 'Alex', + verified: false, + invitedBy: null, + posts: [{ id: 3, ownerId: 3, content: 'Post3', createdAt: usersWithPosts[2]?.posts[0]?.createdAt }], + }); +}); + +test.skip('[Find Many] Get users with posts + limit posts', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 1, content: 'Post1.2' }, + { ownerId: 1, content: 'Post1.3' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 2, content: 'Post2.1' }, + { ownerId: 3, content: 'Post3' }, + { ownerId: 3, content: 'Post3.1' }, + ]); + + const usersWithPosts = await db.query.usersTable.findMany({ + with: { + posts: { + limit: 1, + }, + }, + }); + + expectTypeOf(usersWithPosts).toEqualTypeOf<{ + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + posts: { + id: number; + content: string; + ownerId: number | null; + createdAt: Date; + }[]; + }[]>(); + + usersWithPosts.sort((a, b) => (a.id > b.id) ? 1 : -1); + usersWithPosts[0]?.posts.sort((a, b) => (a.id > b.id) ? 1 : -1); + usersWithPosts[1]?.posts.sort((a, b) => (a.id > b.id) ? 1 : -1); + usersWithPosts[2]?.posts.sort((a, b) => (a.id > b.id) ? 
1 : -1); + + expect(usersWithPosts.length).eq(3); + expect(usersWithPosts[0]?.posts.length).eq(1); + expect(usersWithPosts[1]?.posts.length).eq(1); + expect(usersWithPosts[2]?.posts.length).eq(1); + + expect(usersWithPosts[0]).toEqual({ + id: 1, + name: 'Dan', + verified: false, + invitedBy: null, + posts: [{ id: 1, ownerId: 1, content: 'Post1', createdAt: usersWithPosts[0]?.posts[0]?.createdAt }], + }); + expect(usersWithPosts[1]).toEqual({ + id: 2, + name: 'Andrew', + verified: false, + invitedBy: null, + posts: [{ id: 4, ownerId: 2, content: 'Post2', createdAt: usersWithPosts[1]?.posts[0]?.createdAt }], + }); + expect(usersWithPosts[2]).toEqual({ + id: 3, + name: 'Alex', + verified: false, + invitedBy: null, + posts: [{ id: 6, ownerId: 3, content: 'Post3', createdAt: usersWithPosts[2]?.posts[0]?.createdAt }], + }); +}); + +test.skip('[Find Many] Get users with posts + limit posts and users', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 1, content: 'Post1.2' }, + { ownerId: 1, content: 'Post1.3' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 2, content: 'Post2.1' }, + { ownerId: 3, content: 'Post3' }, + { ownerId: 3, content: 'Post3.1' }, + ]); + + const usersWithPosts = await db.query.usersTable.findMany({ + limit: 2, + with: { + posts: { + limit: 1, + }, + }, + }); + + expectTypeOf(usersWithPosts).toEqualTypeOf<{ + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + posts: { + id: number; + content: string; + ownerId: number | null; + createdAt: Date; + }[]; + }[]>(); + + usersWithPosts.sort((a, b) => (a.id > b.id) ? 1 : -1); + usersWithPosts[0]?.posts.sort((a, b) => (a.id > b.id) ? 1 : -1); + usersWithPosts[1]?.posts.sort((a, b) => (a.id > b.id) ? 
1 : -1); + + expect(usersWithPosts.length).eq(2); + expect(usersWithPosts[0]?.posts.length).eq(1); + expect(usersWithPosts[1]?.posts.length).eq(1); + + expect(usersWithPosts[0]).toEqual({ + id: 1, + name: 'Dan', + verified: false, + invitedBy: null, + posts: [{ id: 1, ownerId: 1, content: 'Post1', createdAt: usersWithPosts[0]?.posts[0]?.createdAt }], + }); + expect(usersWithPosts[1]).toEqual({ + id: 2, + name: 'Andrew', + verified: false, + invitedBy: null, + posts: [{ id: 4, ownerId: 2, content: 'Post2', createdAt: usersWithPosts[1]?.posts[0]?.createdAt }], + }); +}); + +test('[Find Many] Get users with posts + custom fields', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 1, content: 'Post1.2' }, + { ownerId: 1, content: 'Post1.3' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 2, content: 'Post2.1' }, + { ownerId: 3, content: 'Post3' }, + { ownerId: 3, content: 'Post3.1' }, + ]); + + const usersWithPosts = await db.query.usersTable.findMany({ + with: { + posts: true, + }, + extras: ({ name }) => ({ + lowerName: sql`lower(${name})`.as('name_lower'), + }), + }); + + expectTypeOf(usersWithPosts).toEqualTypeOf<{ + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + lowerName: string; + posts: { + id: number; + content: string; + ownerId: number | null; + createdAt: Date; + }[]; + }[]>(); + + usersWithPosts.sort((a, b) => (a.id > b.id) ? 1 : -1); + usersWithPosts[0]?.posts.sort((a, b) => (a.id > b.id) ? 1 : -1); + usersWithPosts[1]?.posts.sort((a, b) => (a.id > b.id) ? 1 : -1); + usersWithPosts[2]?.posts.sort((a, b) => (a.id > b.id) ? 
1 : -1); + + expect(usersWithPosts.length).toEqual(3); + expect(usersWithPosts[0]?.posts.length).toEqual(3); + expect(usersWithPosts[1]?.posts.length).toEqual(2); + expect(usersWithPosts[2]?.posts.length).toEqual(2); + + expect(usersWithPosts[0]).toEqual({ + id: 1, + name: 'Dan', + verified: false, + invitedBy: null, + lowerName: 'dan', + posts: [{ id: 1, ownerId: 1, content: 'Post1', createdAt: usersWithPosts[0]?.posts[0]?.createdAt }, { + id: 2, + ownerId: 1, + content: 'Post1.2', + createdAt: usersWithPosts[0]?.posts[1]?.createdAt, + }, { id: 3, ownerId: 1, content: 'Post1.3', createdAt: usersWithPosts[0]?.posts[2]?.createdAt }], + }); + expect(usersWithPosts[1]).toEqual({ + id: 2, + name: 'Andrew', + lowerName: 'andrew', + verified: false, + invitedBy: null, + posts: [{ id: 4, ownerId: 2, content: 'Post2', createdAt: usersWithPosts[1]?.posts[0]?.createdAt }, { + id: 5, + ownerId: 2, + content: 'Post2.1', + createdAt: usersWithPosts[1]?.posts[1]?.createdAt, + }], + }); + expect(usersWithPosts[2]).toEqual({ + id: 3, + name: 'Alex', + lowerName: 'alex', + verified: false, + invitedBy: null, + posts: [{ id: 6, ownerId: 3, content: 'Post3', createdAt: usersWithPosts[2]?.posts[0]?.createdAt }, { + id: 7, + ownerId: 3, + content: 'Post3.1', + createdAt: usersWithPosts[2]?.posts[1]?.createdAt, + }], + }); +}); + +test.skip('[Find Many] Get users with posts + custom fields + limits', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 1, content: 'Post1.2' }, + { ownerId: 1, content: 'Post1.3' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 2, content: 'Post2.1' }, + { ownerId: 3, content: 'Post3' }, + { ownerId: 3, content: 'Post3.1' }, + ]); + + const usersWithPosts = await db.query.usersTable.findMany({ + limit: 1, + with: { + posts: { + limit: 1, + 
}, + }, + extras: (usersTable, { sql }) => ({ + lowerName: sql`lower(${usersTable.name})`.as('name_lower'), + }), + }); + + expectTypeOf(usersWithPosts).toEqualTypeOf<{ + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + lowerName: string; + posts: { + id: number; + content: string; + ownerId: number | null; + createdAt: Date; + }[]; + }[]>(); + + expect(usersWithPosts.length).toEqual(1); + expect(usersWithPosts[0]?.posts.length).toEqual(1); + + expect(usersWithPosts[0]).toEqual({ + id: 1, + name: 'Dan', + lowerName: 'dan', + verified: false, + invitedBy: null, + posts: [{ id: 1, ownerId: 1, content: 'Post1', createdAt: usersWithPosts[0]?.posts[0]?.createdAt }], + }); +}); + +test.skip('[Find Many] Get users with posts + orderBy', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: '1' }, + { ownerId: 1, content: '2' }, + { ownerId: 1, content: '3' }, + { ownerId: 2, content: '4' }, + { ownerId: 2, content: '5' }, + { ownerId: 3, content: '6' }, + { ownerId: 3, content: '7' }, + ]); + + const usersWithPosts = await db.query.usersTable.findMany({ + with: { + posts: { + orderBy: (postsTable, { desc }) => [desc(postsTable.content)], + }, + }, + orderBy: (usersTable, { desc }) => [desc(usersTable.id)], + }); + + expectTypeOf(usersWithPosts).toEqualTypeOf<{ + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + posts: { + id: number; + content: string; + ownerId: number | null; + createdAt: Date; + }[]; + }[]>(); + + expect(usersWithPosts.length).eq(3); + expect(usersWithPosts[0]?.posts.length).eq(2); + expect(usersWithPosts[1]?.posts.length).eq(2); + expect(usersWithPosts[2]?.posts.length).eq(3); + + expect(usersWithPosts[2]).toEqual({ + id: 1, + name: 'Dan', + verified: false, + invitedBy: null, + posts: [{ id: 3, ownerId: 
1, content: '3', createdAt: usersWithPosts[2]?.posts[2]?.createdAt }, { + id: 2, + ownerId: 1, + content: '2', + createdAt: usersWithPosts[2]?.posts[1]?.createdAt, + }, { id: 1, ownerId: 1, content: '1', createdAt: usersWithPosts[2]?.posts[0]?.createdAt }], + }); + expect(usersWithPosts[1]).toEqual({ + id: 2, + name: 'Andrew', + verified: false, + invitedBy: null, + posts: [{ + id: 5, + ownerId: 2, + content: '5', + createdAt: usersWithPosts[1]?.posts[1]?.createdAt, + }, { id: 4, ownerId: 2, content: '4', createdAt: usersWithPosts[1]?.posts[0]?.createdAt }], + }); + expect(usersWithPosts[0]).toEqual({ + id: 3, + name: 'Alex', + verified: false, + invitedBy: null, + posts: [{ + id: 7, + ownerId: 3, + content: '7', + createdAt: usersWithPosts[0]?.posts[1]?.createdAt, + }, { id: 6, ownerId: 3, content: '6', createdAt: usersWithPosts[0]?.posts[0]?.createdAt }], + }); +}); + +test('[Find Many] Get users with posts + where', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 1, content: 'Post1.1' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 3, content: 'Post3' }, + ]); + + const usersWithPosts = await db.query.usersTable.findMany({ + where: (({ id }, { eq }) => eq(id, 1)), + with: { + posts: { + where: (({ id }, { eq }) => eq(id, 1)), + }, + }, + }); + + expectTypeOf(usersWithPosts).toEqualTypeOf<{ + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + posts: { + id: number; + content: string; + ownerId: number | null; + createdAt: Date; + }[]; + }[]>(); + + expect(usersWithPosts.length).eq(1); + expect(usersWithPosts[0]?.posts.length).eq(1); + + expect(usersWithPosts[0]).toEqual({ + id: 1, + name: 'Dan', + verified: false, + invitedBy: null, + posts: [{ id: 1, ownerId: 1, content: 'Post1', createdAt: 
usersWithPosts[0]?.posts[0]?.createdAt }], + }); +}); + +test('[Find Many] Get users with posts + where + partial', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 1, content: 'Post1.1' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 3, content: 'Post3' }, + ]); + + const usersWithPosts = await db.query.usersTable.findMany({ + columns: { + id: true, + name: true, + }, + with: { + posts: { + columns: { + id: true, + content: true, + }, + where: (({ id }, { eq }) => eq(id, 1)), + }, + }, + where: (({ id }, { eq }) => eq(id, 1)), + }); + + expectTypeOf(usersWithPosts).toEqualTypeOf<{ + id: number; + name: string; + posts: { + id: number; + content: string; + }[]; + }[]>(); + + expect(usersWithPosts.length).eq(1); + expect(usersWithPosts[0]?.posts.length).eq(1); + + expect(usersWithPosts[0]).toEqual({ + id: 1, + name: 'Dan', + posts: [{ id: 1, content: 'Post1' }], + }); +}); + +test('[Find Many] Get users with posts + where + partial. 
Did not select posts id, but used it in where', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 1, content: 'Post1.1' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 3, content: 'Post3' }, + ]); + + const usersWithPosts = await db.query.usersTable.findMany({ + columns: { + id: true, + name: true, + }, + with: { + posts: { + columns: { + id: true, + content: true, + }, + where: (({ id }, { eq }) => eq(id, 1)), + }, + }, + where: (({ id }, { eq }) => eq(id, 1)), + }); + + expectTypeOf(usersWithPosts).toEqualTypeOf<{ + id: number; + name: string; + posts: { + id: number; + content: string; + }[]; + }[]>(); + + expect(usersWithPosts.length).eq(1); + expect(usersWithPosts[0]?.posts.length).eq(1); + + expect(usersWithPosts[0]).toEqual({ + id: 1, + name: 'Dan', + posts: [{ id: 1, content: 'Post1' }], + }); +}); + +test('[Find Many] Get users with posts + where + partial(true + false)', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 1, content: 'Post1.1' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 3, content: 'Post3' }, + ]); + + const usersWithPosts = await db.query.usersTable.findMany({ + columns: { + id: true, + name: false, + }, + with: { + posts: { + columns: { + id: true, + content: false, + }, + where: (({ id }, { eq }) => eq(id, 1)), + }, + }, + where: (({ id }, { eq }) => eq(id, 1)), + }); + + expectTypeOf(usersWithPosts).toEqualTypeOf<{ + id: number; + posts: { + id: number; + }[]; + }[]>(); + + expect(usersWithPosts.length).eq(1); + expect(usersWithPosts[0]?.posts.length).eq(1); + + 
expect(usersWithPosts[0]).toEqual({ + id: 1, + posts: [{ id: 1 }], + }); +}); + +test('[Find Many] Get users with posts + where + partial(false)', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 1, content: 'Post1.1' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 3, content: 'Post3' }, + ]); + + const usersWithPosts = await db.query.usersTable.findMany({ + columns: { + name: false, + }, + with: { + posts: { + columns: { + content: false, + }, + where: (({ id }, { eq }) => eq(id, 1)), + }, + }, + where: (({ id }, { eq }) => eq(id, 1)), + }); + + expectTypeOf(usersWithPosts).toEqualTypeOf<{ + id: number; + verified: boolean; + invitedBy: number | null; + posts: { + id: number; + ownerId: number | null; + createdAt: Date; + }[]; + }[]>(); + + expect(usersWithPosts.length).eq(1); + expect(usersWithPosts[0]?.posts.length).eq(1); + + expect(usersWithPosts[0]).toEqual({ + id: 1, + verified: false, + invitedBy: null, + posts: [{ id: 1, ownerId: 1, createdAt: usersWithPosts[0]?.posts[0]?.createdAt }], + }); +}); + +test('[Find Many] Get users with posts in transaction', async (t) => { + const { singlestoreDb: db } = t; + + let usersWithPosts: { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + posts: { + id: number; + content: string; + ownerId: number | null; + createdAt: Date; + }[]; + }[] = []; + + await db.transaction(async (tx) => { + await tx.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await tx.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 1, content: 'Post1.1' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 3, content: 'Post3' }, + ]); + + usersWithPosts = await tx.query.usersTable.findMany({ + where: (({ 
id }, { eq }) => eq(id, 1)), + with: { + posts: { + where: (({ id }, { eq }) => eq(id, 1)), + }, + }, + }); + }); + + expectTypeOf(usersWithPosts).toEqualTypeOf<{ + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + posts: { + id: number; + content: string; + ownerId: number | null; + createdAt: Date; + }[]; + }[]>(); + + expect(usersWithPosts.length).eq(1); + expect(usersWithPosts[0]?.posts.length).eq(1); + + expect(usersWithPosts[0]).toEqual({ + id: 1, + name: 'Dan', + verified: false, + invitedBy: null, + posts: [{ id: 1, ownerId: 1, content: 'Post1', createdAt: usersWithPosts[0]?.posts[0]?.createdAt }], + }); +}); + +test('[Find Many] Get users with posts in rollbacked transaction', async (t) => { + const { singlestoreDb: db } = t; + + let usersWithPosts: { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + posts: { + id: number; + content: string; + ownerId: number | null; + createdAt: Date; + }[]; + }[] = []; + + await expect(db.transaction(async (tx) => { + await tx.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await tx.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 1, content: 'Post1.1' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 3, content: 'Post3' }, + ]); + + tx.rollback(); + + usersWithPosts = await tx.query.usersTable.findMany({ + where: (({ id }, { eq }) => eq(id, 1)), + with: { + posts: { + where: (({ id }, { eq }) => eq(id, 1)), + }, + }, + }); + })).rejects.toThrowError(new TransactionRollbackError()); + + expectTypeOf(usersWithPosts).toEqualTypeOf<{ + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + posts: { + id: number; + content: string; + ownerId: number | null; + createdAt: Date; + }[]; + }[]>(); + + expect(usersWithPosts.length).eq(0); +}); + +// select only custom +test('[Find Many] Get only custom fields', async () => { + await 
db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(postsTable).values([ + { id: 1, ownerId: 1, content: 'Post1' }, + { id: 2, ownerId: 1, content: 'Post1.2' }, + { id: 3, ownerId: 1, content: 'Post1.3' }, + { id: 4, ownerId: 2, content: 'Post2' }, + { id: 5, ownerId: 2, content: 'Post2.1' }, + { id: 6, ownerId: 3, content: 'Post3' }, + { id: 7, ownerId: 3, content: 'Post3.1' }, + ]); + + const usersWithPosts = await db.query.usersTable.findMany({ + columns: {}, + with: { + posts: { + columns: {}, + extras: ({ content }) => ({ + lowerName: sql`lower(${content})`.as('content_lower'), + }), + }, + }, + extras: ({ name }) => ({ + lowerName: sql`lower(${name})`.as('name_lower'), + }), + }); + + // Type Assertion + expectTypeOf(usersWithPosts).toEqualTypeOf<{ + lowerName: string; + posts: { + lowerName: string; + }[]; + }[]>(); + + // General Assertions + expect(usersWithPosts).toHaveLength(3); + + // Helper function to find user by lowerName + const findUser = (lowerName: string) => usersWithPosts.find((user) => user.lowerName === lowerName); + + // Assertions for each user + const dan = findUser('dan'); + const andrew = findUser('andrew'); + const alex = findUser('alex'); + + expect(dan).toBeDefined(); + expect(andrew).toBeDefined(); + expect(alex).toBeDefined(); + + // Verify the number of posts for each user + expect(dan?.posts).toHaveLength(3); + expect(andrew?.posts).toHaveLength(2); + expect(alex?.posts).toHaveLength(2); + + // Define expected posts for each user + const expectedDanPosts = ['post1', 'post1.2', 'post1.3']; + const expectedAndrewPosts = ['post2', 'post2.1']; + const expectedAlexPosts = ['post3', 'post3.1']; + + // Helper function to extract lowerNames from posts + const getPostLowerNames = (posts: { lowerName: string }[]) => posts.map((post) => post.lowerName); + + // Assertions for Dan's posts + 
expect(getPostLowerNames(dan!.posts)).toEqual(expect.arrayContaining(expectedDanPosts)); + expect(getPostLowerNames(dan!.posts)).toHaveLength(expectedDanPosts.length); + + // Assertions for Andrew's posts + expect(getPostLowerNames(andrew!.posts)).toEqual(expect.arrayContaining(expectedAndrewPosts)); + expect(getPostLowerNames(andrew!.posts)).toHaveLength(expectedAndrewPosts.length); + + // Assertions for Alex's posts + expect(getPostLowerNames(alex!.posts)).toEqual(expect.arrayContaining(expectedAlexPosts)); + expect(getPostLowerNames(alex!.posts)).toHaveLength(expectedAlexPosts.length); +}); + +// select only custom with where clause (Order Agnostic) +test('[Find Many] Get only custom fields + where', async (t) => { + const { singlestoreDb: db } = t; + + // Insert Users + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + // Insert Posts + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 1, content: 'Post1.2' }, + { ownerId: 1, content: 'Post1.3' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 2, content: 'Post2.1' }, + { ownerId: 3, content: 'Post3' }, + { ownerId: 3, content: 'Post3.1' }, + ]); + + // Query Users with Posts where users.id = 1 and posts.id >= 2 + const usersWithPosts = await db.query.usersTable.findMany({ + columns: {}, + with: { + posts: { + columns: {}, + where: gte(postsTable.id, 2), + extras: ({ content }) => ({ + lowerName: sql`lower(${content})`.as('content_lower'), + }), + }, + }, + where: eq(usersTable.id, 1), + extras: ({ name }) => ({ + lowerName: sql`lower(${name})`.as('name_lower'), + }), + }); + + // Type Assertion + expectTypeOf(usersWithPosts).toEqualTypeOf<{ + lowerName: string; + posts: { + lowerName: string; + }[]; + }[]>(); + + // General Assertions + expect(usersWithPosts).toHaveLength(1); + + // Since we expect only one user, we can extract it directly + const danWithPosts = usersWithPosts[0]; + + // Assert 
that the user exists and has the correct lowerName + expect(danWithPosts).toBeDefined(); + expect(danWithPosts?.lowerName).toBe('dan'); + + // Assert that the user has the expected number of posts + expect(danWithPosts?.posts).toHaveLength(2); + + // Define the expected posts + const expectedPosts = ['post1.2', 'post1.3']; + + // Extract the lowerName of each post + const actualPostLowerNames = danWithPosts?.posts.map((post) => post.lowerName); + + // Assert that all expected posts are present, regardless of order + for (const expectedPost of expectedPosts) { + expect(actualPostLowerNames).toContain(expectedPost); + } + + // Additionally, ensure no unexpected posts are present + expect(actualPostLowerNames).toHaveLength(expectedPosts.length); +}); + +test.skip('[Find Many] Get only custom fields + where + limit', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 1, content: 'Post1.2' }, + { ownerId: 1, content: 'Post1.3' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 2, content: 'Post2.1' }, + { ownerId: 3, content: 'Post3' }, + { ownerId: 3, content: 'Post3.1' }, + ]); + + const usersWithPosts = await db.query.usersTable.findMany({ + columns: {}, + with: { + posts: { + columns: {}, + where: gte(postsTable.id, 2), + limit: 1, + extras: ({ content }) => ({ + lowerName: sql`lower(${content})`.as('content_lower'), + }), + }, + }, + where: eq(usersTable.id, 1), + extras: ({ name }) => ({ + lowerName: sql`lower(${name})`.as('name_lower'), + }), + }); + + expectTypeOf(usersWithPosts).toEqualTypeOf<{ + lowerName: string; + posts: { + lowerName: string; + }[]; + }[]>(); + + expect(usersWithPosts.length).toEqual(1); + expect(usersWithPosts[0]?.posts.length).toEqual(1); + + expect(usersWithPosts).toContainEqual({ + lowerName: 'dan', + posts: [{ lowerName: 
'post1.2' }], + }); +}); + +test.skip('[Find Many] Get only custom fields + where + orderBy', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 1, content: 'Post1.2' }, + { ownerId: 1, content: 'Post1.3' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 2, content: 'Post2.1' }, + { ownerId: 3, content: 'Post3' }, + { ownerId: 3, content: 'Post3.1' }, + ]); + + const usersWithPosts = await db.query.usersTable.findMany({ + columns: {}, + with: { + posts: { + columns: {}, + where: gte(postsTable.id, 2), + orderBy: [desc(postsTable.id)], + extras: ({ content }) => ({ + lowerName: sql`lower(${content})`.as('content_lower'), + }), + }, + }, + where: eq(usersTable.id, 1), + extras: ({ name }) => ({ + lowerName: sql`lower(${name})`.as('name_lower'), + }), + }); + + expectTypeOf(usersWithPosts).toEqualTypeOf<{ + lowerName: string; + posts: { + lowerName: string; + }[]; + }[]>(); + + expect(usersWithPosts.length).toEqual(1); + expect(usersWithPosts[0]?.posts.length).toEqual(2); + + expect(usersWithPosts).toContainEqual({ + lowerName: 'dan', + posts: [{ lowerName: 'post1.3' }, { lowerName: 'post1.2' }], + }); +}); + +// select only custom find one (Order Agnostic) +test('[Find One] Get only custom fields (Order Agnostic)', async () => { + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 1, content: 'Post1.2' }, + { ownerId: 1, content: 'Post1.3' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 2, content: 'Post2.1' }, + { ownerId: 3, content: 'Post3' }, + { ownerId: 3, content: 'Post3.1' }, + ]); + + // Query to find the first user without any specific order + const usersWithPosts = await 
db.query.usersTable.findFirst({ + columns: {}, + with: { + posts: { + columns: {}, + extras: ({ content }) => ({ + lowerName: sql`lower(${content})`.as('content_lower'), + }), + }, + }, + extras: ({ name }) => ({ + lowerName: sql`lower(${name})`.as('name_lower'), + }), + }); + + // Type Assertion + expectTypeOf(usersWithPosts).toEqualTypeOf< + { + lowerName: string; + posts: { + lowerName: string; + }[]; + } | undefined + >(); + + // General Assertions + expect(usersWithPosts).toBeDefined(); + + // Since findFirst without orderBy can return any user, we'll verify the returned user and their posts + if (usersWithPosts) { + // Define expected users and their corresponding posts + const expectedUsers: { [key: string]: string[] } = { + dan: ['post1', 'post1.2', 'post1.3'], + andrew: ['post2', 'post2.1'], + alex: ['post3', 'post3.1'], + }; + + // Verify that the returned user is one of the expected users + expect(Object.keys(expectedUsers)).toContain(usersWithPosts.lowerName); + + // Get the expected posts for the returned user + const expectedPosts = expectedUsers[usersWithPosts.lowerName] as string[]; + + // Verify the number of posts + expect(usersWithPosts.posts).toHaveLength(expectedPosts.length); + + // Extract the lowerName of each post + const actualPostLowerNames = usersWithPosts.posts.map((post) => post.lowerName); + + // Assert that all expected posts are present, regardless of order + for (const expectedPost of expectedPosts) { + expect(actualPostLowerNames).toContain(expectedPost.toLowerCase()); + } + } +}); + +// select only custom find one with where clause (Order Agnostic) +test('[Find One] Get only custom fields + where (Order Agnostic)', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 1, content: 'Post1.2' }, + { ownerId: 1, content: 
'Post1.3' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 2, content: 'Post2.1' }, + { ownerId: 3, content: 'Post3' }, + { ownerId: 3, content: 'Post3.1' }, + ]); + + // Query to find the first user with id = 1 and posts with id >= 2 + const usersWithPosts = await db.query.usersTable.findFirst({ + columns: {}, + with: { + posts: { + columns: {}, + where: gte(postsTable.id, 2), + extras: ({ content }) => ({ + lowerName: sql`lower(${content})`.as('content_lower'), + }), + }, + }, + where: eq(usersTable.id, 1), + extras: ({ name }) => ({ + lowerName: sql`lower(${name})`.as('name_lower'), + }), + }); + + // Type Assertion + expectTypeOf(usersWithPosts).toEqualTypeOf< + { + lowerName: string; + posts: { + lowerName: string; + }[]; + } | undefined + >(); + + // General Assertions + expect(usersWithPosts).toBeDefined(); + + if (usersWithPosts) { + // Assert that the returned user has the expected lowerName + expect(usersWithPosts.lowerName).toBe('dan'); + + // Assert that the user has exactly two posts + expect(usersWithPosts.posts).toHaveLength(2); + + // Define the expected posts + const expectedPosts = ['post1.2', 'post1.3']; + + // Extract the lowerName of each post + const actualPostLowerNames = usersWithPosts.posts.map((post) => post.lowerName); + + // Assert that all expected posts are present, regardless of order + for (const expectedPost of expectedPosts) { + expect(actualPostLowerNames).toContain(expectedPost.toLowerCase()); + } + + // Additionally, ensure no unexpected posts are present + expect(actualPostLowerNames).toHaveLength(expectedPosts.length); + } +}); + +test.skip('[Find One] Get only custom fields + where + limit', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 1, content: 'Post1.2' }, + { ownerId: 1, content: 'Post1.3' }, + { 
ownerId: 2, content: 'Post2' }, + { ownerId: 2, content: 'Post2.1' }, + { ownerId: 3, content: 'Post3' }, + { ownerId: 3, content: 'Post3.1' }, + ]); + + const usersWithPosts = await db.query.usersTable.findFirst({ + columns: {}, + with: { + posts: { + columns: {}, + where: gte(postsTable.id, 2), + limit: 1, + extras: ({ content }) => ({ + lowerName: sql`lower(${content})`.as('content_lower'), + }), + }, + }, + where: eq(usersTable.id, 1), + extras: ({ name }) => ({ + lowerName: sql`lower(${name})`.as('name_lower'), + }), + }); + + expectTypeOf(usersWithPosts).toEqualTypeOf< + { + lowerName: string; + posts: { + lowerName: string; + }[]; + } | undefined + >(); + + expect(usersWithPosts?.posts.length).toEqual(1); + + expect(usersWithPosts).toEqual({ + lowerName: 'dan', + posts: [{ lowerName: 'post1.2' }], + }); +}); + +test.skip('[Find One] Get only custom fields + where + orderBy', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 1, content: 'Post1.2' }, + { ownerId: 1, content: 'Post1.3' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 2, content: 'Post2.1' }, + { ownerId: 3, content: 'Post3' }, + { ownerId: 3, content: 'Post3.1' }, + ]); + + const usersWithPosts = await db.query.usersTable.findFirst({ + columns: {}, + with: { + posts: { + columns: {}, + where: gte(postsTable.id, 2), + orderBy: [desc(postsTable.id)], + extras: ({ content }) => ({ + lowerName: sql`lower(${content})`.as('content_lower'), + }), + }, + }, + where: eq(usersTable.id, 1), + extras: ({ name }) => ({ + lowerName: sql`lower(${name})`.as('name_lower'), + }), + }); + + expectTypeOf(usersWithPosts).toEqualTypeOf< + { + lowerName: string; + posts: { + lowerName: string; + }[]; + } | undefined + >(); + + expect(usersWithPosts?.posts.length).toEqual(2); + + 
expect(usersWithPosts).toEqual({ + lowerName: 'dan', + posts: [{ lowerName: 'post1.3' }, { lowerName: 'post1.2' }], + }); +}); + +// columns {} +test('[Find Many] Get select {}', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await expect( + async () => + await db.query.usersTable.findMany({ + columns: {}, + }), + ).rejects.toThrow(DrizzleError); +}); + +// columns {} +test('[Find One] Get select {}', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await expect(async () => + await db.query.usersTable.findFirst({ + columns: {}, + }) + ).rejects.toThrow(DrizzleError); +}); + +// deep select {} +test('[Find Many] Get deep select {}', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 3, content: 'Post3' }, + ]); + + await expect(async () => + await db.query.usersTable.findMany({ + columns: {}, + with: { + posts: { + columns: {}, + }, + }, + }) + ).rejects.toThrow(DrizzleError); +}); + +// deep select {} +test('[Find One] Get deep select {}', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 3, content: 'Post3' }, + ]); + + await expect(async () => + await db.query.usersTable.findFirst({ + columns: {}, + with: { + posts: { + columns: {}, + }, + }, + }) + ).rejects.toThrow(DrizzleError); +}); + +/* + 
Prepared statements for users+posts +*/ +test.skip('[Find Many] Get users with posts + prepared limit', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 1, content: 'Post1.2' }, + { ownerId: 1, content: 'Post1.3' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 2, content: 'Post2.1' }, + { ownerId: 3, content: 'Post3' }, + { ownerId: 3, content: 'Post3.1' }, + ]); + + const prepared = db.query.usersTable.findMany({ + with: { + posts: { + limit: placeholder('limit'), + }, + }, + }).prepare(); + + const usersWithPosts = await prepared.execute({ limit: 1 }); + + expectTypeOf(usersWithPosts).toEqualTypeOf<{ + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + posts: { + id: number; + content: string; + ownerId: number | null; + createdAt: Date; + }[]; + }[]>(); + + expect(usersWithPosts.length).eq(3); + expect(usersWithPosts[0]?.posts.length).eq(1); + expect(usersWithPosts[1]?.posts.length).eq(1); + expect(usersWithPosts[2]?.posts.length).eq(1); + + expect(usersWithPosts).toContainEqual({ + id: 1, + name: 'Dan', + verified: false, + invitedBy: null, + posts: [{ id: 1, ownerId: 1, content: 'Post1', createdAt: usersWithPosts[0]?.posts[0]?.createdAt }], + }); + expect(usersWithPosts).toContainEqual({ + id: 2, + name: 'Andrew', + verified: false, + invitedBy: null, + posts: [{ id: 4, ownerId: 2, content: 'Post2', createdAt: usersWithPosts[1]?.posts[0]?.createdAt }], + }); + expect(usersWithPosts).toContainEqual({ + id: 3, + name: 'Alex', + verified: false, + invitedBy: null, + posts: [{ id: 6, ownerId: 3, content: 'Post3', createdAt: usersWithPosts[2]?.posts[0]?.createdAt }], + }); +}); + +test.skip('[Find Many] Get users with posts + prepared limit + offset', async (t) => { + const { singlestoreDb: db } = t; + + await 
db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 1, content: 'Post1.2' }, + { ownerId: 1, content: 'Post1.3' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 2, content: 'Post2.1' }, + { ownerId: 3, content: 'Post3' }, + { ownerId: 3, content: 'Post3.1' }, + ]); + + const prepared = db.query.usersTable.findMany({ + limit: placeholder('uLimit'), + offset: placeholder('uOffset'), + with: { + posts: { + limit: placeholder('pLimit'), + }, + }, + }).prepare(); + + const usersWithPosts = await prepared.execute({ pLimit: 1, uLimit: 3, uOffset: 1 }); + + expectTypeOf(usersWithPosts).toEqualTypeOf<{ + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + posts: { + id: number; + content: string; + ownerId: number | null; + createdAt: Date; + }[]; + }[]>(); + + expect(usersWithPosts.length).eq(2); + expect(usersWithPosts[0]?.posts.length).eq(1); + expect(usersWithPosts[1]?.posts.length).eq(1); + + expect(usersWithPosts).toContainEqual({ + id: 2, + name: 'Andrew', + verified: false, + invitedBy: null, + posts: [{ id: 4, ownerId: 2, content: 'Post2', createdAt: usersWithPosts[0]?.posts[0]?.createdAt }], + }); + expect(usersWithPosts).toContainEqual({ + id: 3, + name: 'Alex', + verified: false, + invitedBy: null, + posts: [{ id: 6, ownerId: 3, content: 'Post3', createdAt: usersWithPosts[1]?.posts[0]?.createdAt }], + }); +}); + +test('[Find Many] Get users with posts + prepared where', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 1, content: 'Post1.1' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 3, content: 'Post3' }, + ]); + + const prepared = 
db.query.usersTable.findMany({ + where: (({ id }, { eq }) => eq(id, placeholder('id'))), + with: { + posts: { + where: (({ id }, { eq }) => eq(id, 1)), + }, + }, + }).prepare(); + + const usersWithPosts = await prepared.execute({ id: 1 }); + + expectTypeOf(usersWithPosts).toEqualTypeOf<{ + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + posts: { + id: number; + content: string; + ownerId: number | null; + createdAt: Date; + }[]; + }[]>(); + + expect(usersWithPosts.length).eq(1); + expect(usersWithPosts[0]?.posts.length).eq(1); + + expect(usersWithPosts[0]).toEqual({ + id: 1, + name: 'Dan', + verified: false, + invitedBy: null, + posts: [{ id: 1, ownerId: 1, content: 'Post1', createdAt: usersWithPosts[0]?.posts[0]?.createdAt }], + }); +}); + +test.skip('[Find Many] Get users with posts + prepared + limit + offset + where', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 1, content: 'Post1.2' }, + { ownerId: 1, content: 'Post1.3' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 2, content: 'Post2.1' }, + { ownerId: 3, content: 'Post3' }, + { ownerId: 3, content: 'Post3.1' }, + ]); + + const prepared = db.query.usersTable.findMany({ + limit: placeholder('uLimit'), + offset: placeholder('uOffset'), + where: (({ id }, { eq, or }) => or(eq(id, placeholder('id')), eq(id, 3))), + with: { + posts: { + where: (({ id }, { eq }) => eq(id, placeholder('pid'))), + limit: placeholder('pLimit'), + }, + }, + }).prepare(); + + const usersWithPosts = await prepared.execute({ pLimit: 1, uLimit: 3, uOffset: 1, id: 2, pid: 6 }); + + expectTypeOf(usersWithPosts).toEqualTypeOf<{ + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + posts: { + id: number; + content: string; + ownerId: number | null; + createdAt: 
Date; + }[]; + }[]>(); + + expect(usersWithPosts.length).eq(1); + expect(usersWithPosts[0]?.posts.length).eq(1); + + expect(usersWithPosts).toContainEqual({ + id: 3, + name: 'Alex', + verified: false, + invitedBy: null, + posts: [{ id: 6, ownerId: 3, content: 'Post3', createdAt: usersWithPosts[0]?.posts[0]?.createdAt }], + }); +}); + +/* + [Find One] One relation users+posts +*/ + +test('[Find One] Get users with posts', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 3, content: 'Post3' }, + ]); + + const usersWithPosts = await db.query.usersTable.findFirst({ + with: { + posts: true, + }, + }); + + // Type Assertion + expectTypeOf(usersWithPosts).toEqualTypeOf< + { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + posts: { + id: number; + content: string; + ownerId: number | null; + createdAt: Date; + }[]; + } | undefined + >(); + + // General Assertions + expect(usersWithPosts).toBeDefined(); + + if (usersWithPosts) { + const { id, name, posts } = usersWithPosts; + + // Verify that the user is one of the inserted users + const validUsers: { [key: number]: string } = { + 1: 'dan', + 2: 'andrew', + 3: 'alex', + }; + expect(validUsers[id]).toBe(name.toLowerCase()); + + // Assert that the user has exactly one post + expect(posts).toHaveLength(1); + + const post = posts[0]; + + // Verify that the post belongs to the user + expect(post?.ownerId).toBe(id); + + // Verify that the post content matches the user + const expectedPostContent = `Post${id}`; + expect(post?.content.toLowerCase()).toBe(expectedPostContent.toLowerCase()); + + // Optionally, verify the presence of `createdAt` + expect(post?.createdAt).toBeInstanceOf(Date); + } +}); + +test.skip('[Find One] Get users with posts + 
limit posts', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 1, content: 'Post1.2' }, + { ownerId: 1, content: 'Post1.3' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 2, content: 'Post2.1' }, + { ownerId: 3, content: 'Post3' }, + { ownerId: 3, content: 'Post3.1' }, + ]); + + const usersWithPosts = await db.query.usersTable.findFirst({ + with: { + posts: { + limit: 1, + }, + }, + }); + + expectTypeOf(usersWithPosts).toEqualTypeOf< + { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + posts: { + id: number; + content: string; + ownerId: number | null; + createdAt: Date; + }[]; + } | undefined + >(); + + expect(usersWithPosts!.posts.length).eq(1); + + expect(usersWithPosts).toEqual({ + id: 1, + name: 'Dan', + verified: false, + invitedBy: null, + posts: [{ id: 1, ownerId: 1, content: 'Post1', createdAt: usersWithPosts?.posts[0]?.createdAt }], + }); +}); + +test.skip('[Find One] Get users with posts no results found', async (t) => { + const { singlestoreDb: db } = t; + + const usersWithPosts = await db.query.usersTable.findFirst({ + with: { + posts: { + limit: 1, + }, + }, + }); + + expectTypeOf(usersWithPosts).toEqualTypeOf< + { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + posts: { + id: number; + content: string; + ownerId: number | null; + createdAt: Date; + }[]; + } | undefined + >(); + + expect(usersWithPosts).toBeUndefined(); +}); + +test.skip('[Find One] Get users with posts + limit posts and users', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 1, 
content: 'Post1.2' }, + { ownerId: 1, content: 'Post1.3' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 2, content: 'Post2.1' }, + { ownerId: 3, content: 'Post3' }, + { ownerId: 3, content: 'Post3.1' }, + ]); + + const usersWithPosts = await db.query.usersTable.findFirst({ + with: { + posts: { + limit: 1, + }, + }, + }); + + expectTypeOf(usersWithPosts).toEqualTypeOf< + { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + posts: { + id: number; + content: string; + ownerId: number | null; + createdAt: Date; + }[]; + } | undefined + >(); + + expect(usersWithPosts!.posts.length).eq(1); + + expect(usersWithPosts).toEqual({ + id: 1, + name: 'Dan', + verified: false, + invitedBy: null, + posts: [{ id: 1, ownerId: 1, content: 'Post1', createdAt: usersWithPosts?.posts[0]?.createdAt }], + }); +}); + +test('[Find One] Get users with posts + custom fields', async () => { + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 1, content: 'Post1.2' }, + { ownerId: 1, content: 'Post1.3' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 2, content: 'Post2.1' }, + { ownerId: 3, content: 'Post3' }, + { ownerId: 3, content: 'Post3.1' }, + ]); + + const usersWithPosts = await db.query.usersTable.findFirst({ + with: { + posts: true, + }, + extras: ({ name }) => ({ + lowerName: sql`lower(${name})`.as('name_lower'), + }), + }); + + // Type Assertion + expectTypeOf(usersWithPosts).toEqualTypeOf< + { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + lowerName: string; + posts: { + id: number; + content: string; + ownerId: number | null; + createdAt: Date; + }[]; + } | undefined + >(); + + // General Assertions + expect(usersWithPosts).toBeDefined(); + + if (usersWithPosts) { + const { id, lowerName, posts } = usersWithPosts; + + // Define valid users and their 
expected lower names + const validUsers: { [key: number]: string } = { + 1: 'dan', + 2: 'andrew', + 3: 'alex', + }; + + // Verify that the returned user's lowerName matches the expected value + expect(validUsers[id]).toBe(lowerName); + + // Define the expected posts based on the user ID + const expectedPostsByUser: Record<number, string[]> = { + 1: ['post1', 'post1.2', 'post1.3'], + 2: ['post2', 'post2.1'], + 3: ['post3', 'post3.1'], + }; + + // Get the expected posts for the returned user + const expectedPosts = expectedPostsByUser[id] || []; + + // Extract the lowercased content of each post + const actualPostContents = posts.map((post) => post.content.toLowerCase()); + + // Assert that all expected posts are present, regardless of order + for (const expectedPost of expectedPosts) { + expect(actualPostContents).toContain(expectedPost.toLowerCase()); + } + + // Optionally, ensure that no unexpected posts are present + expect(actualPostContents).toHaveLength(expectedPosts.length); + } +}); + +test.skip('[Find One] Get users with posts + custom fields + limits', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 1, content: 'Post1.2' }, + { ownerId: 1, content: 'Post1.3' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 2, content: 'Post2.1' }, + { ownerId: 3, content: 'Post3' }, + { ownerId: 3, content: 'Post3.1' }, + ]); + + const usersWithPosts = await db.query.usersTable.findFirst({ + with: { + posts: { + limit: 1, + }, + }, + extras: (usersTable, { sql }) => ({ + lowerName: sql`lower(${usersTable.name})`.as('name_lower'), + }), + }); + + expectTypeOf(usersWithPosts).toEqualTypeOf< + { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + lowerName: string; + posts: { + id: number; + content: string; + ownerId: number | null; + createdAt: Date; + 
}[]; + } | undefined + >(); + + expect(usersWithPosts!.posts.length).toEqual(1); + + expect(usersWithPosts).toEqual({ + id: 1, + name: 'Dan', + lowerName: 'dan', + verified: false, + invitedBy: null, + posts: [{ id: 1, ownerId: 1, content: 'Post1', createdAt: usersWithPosts?.posts[0]?.createdAt }], + }); +}); + +test.skip('[Find One] Get users with posts + orderBy', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: '1' }, + { ownerId: 1, content: '2' }, + { ownerId: 1, content: '3' }, + { ownerId: 2, content: '4' }, + { ownerId: 2, content: '5' }, + { ownerId: 3, content: '6' }, + { ownerId: 3, content: '7' }, + ]); + + const usersWithPosts = await db.query.usersTable.findFirst({ + with: { + posts: { + orderBy: (postsTable, { desc }) => [desc(postsTable.content)], + }, + }, + orderBy: (usersTable, { desc }) => [desc(usersTable.id)], + }); + + expectTypeOf(usersWithPosts).toEqualTypeOf< + { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + posts: { + id: number; + content: string; + ownerId: number | null; + createdAt: Date; + }[]; + } | undefined + >(); + + expect(usersWithPosts!.posts.length).eq(2); + + expect(usersWithPosts).toEqual({ + id: 3, + name: 'Alex', + verified: false, + invitedBy: null, + posts: [{ + id: 7, + ownerId: 3, + content: '7', + createdAt: usersWithPosts?.posts[0]?.createdAt, + }, { id: 6, ownerId: 3, content: '6', createdAt: usersWithPosts?.posts[1]?.createdAt }], + }); +}); + +test('[Find One] Get users with posts + where', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 1, content: 'Post1.1' }, + { 
ownerId: 2, content: 'Post2' }, + { ownerId: 3, content: 'Post3' }, + ]); + + const usersWithPosts = await db.query.usersTable.findFirst({ + where: (({ id }, { eq }) => eq(id, 1)), + with: { + posts: { + where: (({ id }, { eq }) => eq(id, 1)), + }, + }, + }); + + expectTypeOf(usersWithPosts).toEqualTypeOf< + { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + posts: { + id: number; + content: string; + ownerId: number | null; + createdAt: Date; + }[]; + } | undefined + >(); + + expect(usersWithPosts!.posts.length).eq(1); + + expect(usersWithPosts).toEqual({ + id: 1, + name: 'Dan', + verified: false, + invitedBy: null, + posts: [{ id: 1, ownerId: 1, content: 'Post1', createdAt: usersWithPosts?.posts[0]?.createdAt }], + }); +}); + +test('[Find One] Get users with posts + where + partial', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 1, content: 'Post1.1' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 3, content: 'Post3' }, + ]); + + const usersWithPosts = await db.query.usersTable.findFirst({ + columns: { + id: true, + name: true, + }, + with: { + posts: { + columns: { + id: true, + content: true, + }, + where: (({ id }, { eq }) => eq(id, 1)), + }, + }, + where: (({ id }, { eq }) => eq(id, 1)), + }); + + expectTypeOf(usersWithPosts).toEqualTypeOf< + { + id: number; + name: string; + posts: { + id: number; + content: string; + }[]; + } | undefined + >(); + + expect(usersWithPosts!.posts.length).eq(1); + + expect(usersWithPosts).toEqual({ + id: 1, + name: 'Dan', + posts: [{ id: 1, content: 'Post1' }], + }); +}); + +test('[Find One] Get users with posts + where + partial. 
Did not select posts id, but used it in where', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 1, content: 'Post1.1' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 3, content: 'Post3' }, + ]); + + const usersWithPosts = await db.query.usersTable.findFirst({ + columns: { + id: true, + name: true, + }, + with: { + posts: { + columns: { + content: true, + }, + where: (({ id }, { eq }) => eq(id, 1)), + }, + }, + where: (({ id }, { eq }) => eq(id, 1)), + }); + + expectTypeOf(usersWithPosts).toEqualTypeOf< + { + id: number; + name: string; + posts: { + content: string; + }[]; + } | undefined + >(); + + expect(usersWithPosts!.posts.length).eq(1); + + expect(usersWithPosts).toEqual({ + id: 1, + name: 'Dan', + posts: [{ content: 'Post1' }], + }); +}); + +test('[Find One] Get users with posts + where + partial(true + false)', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 1, content: 'Post1.1' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 3, content: 'Post3' }, + ]); + + const usersWithPosts = await db.query.usersTable.findFirst({ + columns: { + id: true, + name: false, + }, + with: { + posts: { + columns: { + id: true, + content: false, + }, + where: (({ id }, { eq }) => eq(id, 1)), + }, + }, + where: (({ id }, { eq }) => eq(id, 1)), + }); + + expectTypeOf(usersWithPosts).toEqualTypeOf< + { + id: number; + posts: { + id: number; + }[]; + } | undefined + >(); + + expect(usersWithPosts!.posts.length).eq(1); + + expect(usersWithPosts).toEqual({ + id: 1, + posts: [{ id: 1 }], + }); +}); + 
+test('[Find One] Get users with posts + where + partial(false)', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 1, content: 'Post1.1' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 3, content: 'Post3' }, + ]); + + const usersWithPosts = await db.query.usersTable.findFirst({ + columns: { + name: false, + }, + with: { + posts: { + columns: { + content: false, + }, + where: (({ id }, { eq }) => eq(id, 1)), + }, + }, + where: (({ id }, { eq }) => eq(id, 1)), + }); + + expectTypeOf(usersWithPosts).toEqualTypeOf< + { + id: number; + verified: boolean; + invitedBy: number | null; + posts: { + id: number; + ownerId: number | null; + createdAt: Date; + }[]; + } | undefined + >(); + + expect(usersWithPosts!.posts.length).eq(1); + + expect(usersWithPosts).toEqual({ + id: 1, + verified: false, + invitedBy: null, + posts: [{ id: 1, ownerId: 1, createdAt: usersWithPosts?.posts[0]?.createdAt }], + }); +}); + +/* + One relation users+users. Self referencing +*/ + +test.skip('Get user with invitee', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex', invitedBy: 1 }, + { id: 4, name: 'John', invitedBy: 2 }, + ]); + + const usersWithInvitee = await db.query.usersTable.findMany({ + with: { + invitee: true, + }, + }); + + expectTypeOf(usersWithInvitee).toEqualTypeOf< + { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + invitee: { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + } | null; + }[] + >(); + + usersWithInvitee.sort((a, b) => (a.id > b.id) ? 
1 : -1); + + expect(usersWithInvitee.length).eq(4); + expect(usersWithInvitee[0]?.invitee).toBeNull(); + expect(usersWithInvitee[1]?.invitee).toBeNull(); + expect(usersWithInvitee[2]?.invitee).not.toBeNull(); + expect(usersWithInvitee[3]?.invitee).not.toBeNull(); + + expect(usersWithInvitee[0]).toEqual({ + id: 1, + name: 'Dan', + verified: false, + invitedBy: null, + invitee: null, + }); + expect(usersWithInvitee[1]).toEqual({ + id: 2, + name: 'Andrew', + verified: false, + invitedBy: null, + invitee: null, + }); + expect(usersWithInvitee[2]).toEqual({ + id: 3, + name: 'Alex', + verified: false, + invitedBy: 1, + invitee: { id: 1, name: 'Dan', verified: false, invitedBy: null }, + }); + expect(usersWithInvitee[3]).toEqual({ + id: 4, + name: 'John', + verified: false, + invitedBy: 2, + invitee: { id: 2, name: 'Andrew', verified: false, invitedBy: null }, + }); +}); + +test.skip('Get user + limit with invitee', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew', invitedBy: 1 }, + { id: 3, name: 'Alex', invitedBy: 1 }, + { id: 4, name: 'John', invitedBy: 2 }, + ]); + + const usersWithInvitee = await db.query.usersTable.findMany({ + with: { + invitee: true, + }, + limit: 2, + }); + + expectTypeOf(usersWithInvitee).toEqualTypeOf< + { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + invitee: { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + } | null; + }[] + >(); + + usersWithInvitee.sort((a, b) => (a.id > b.id) ? 
1 : -1); + + expect(usersWithInvitee.length).eq(2); + expect(usersWithInvitee[0]?.invitee).toBeNull(); + expect(usersWithInvitee[1]?.invitee).not.toBeNull(); + + expect(usersWithInvitee[0]).toEqual({ + id: 1, + name: 'Dan', + verified: false, + invitedBy: null, + invitee: null, + }); + expect(usersWithInvitee[1]).toEqual({ + id: 2, + name: 'Andrew', + verified: false, + invitedBy: 1, + invitee: { id: 1, name: 'Dan', verified: false, invitedBy: null }, + }); +}); + +test.skip('Get user with invitee and custom fields', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex', invitedBy: 1 }, + { id: 4, name: 'John', invitedBy: 2 }, + ]); + + const usersWithInvitee = await db.query.usersTable.findMany({ + extras: (users, { sql }) => ({ lower: sql`lower(${users.name})`.as('lower_name') }), + with: { + invitee: { + extras: (invitee, { sql }) => ({ lower: sql`lower(${invitee.name})`.as('lower_name') }), + }, + }, + }); + + expectTypeOf(usersWithInvitee).toEqualTypeOf< + { + id: number; + name: string; + verified: boolean; + lower: string; + invitedBy: number | null; + invitee: { + id: number; + name: string; + verified: boolean; + lower: string; + invitedBy: number | null; + } | null; + }[] + >(); + + usersWithInvitee.sort((a, b) => (a.id > b.id) ? 
1 : -1); + + expect(usersWithInvitee.length).eq(4); + expect(usersWithInvitee[0]?.invitee).toBeNull(); + expect(usersWithInvitee[1]?.invitee).toBeNull(); + expect(usersWithInvitee[2]?.invitee).not.toBeNull(); + expect(usersWithInvitee[3]?.invitee).not.toBeNull(); + + expect(usersWithInvitee[0]).toEqual({ + id: 1, + name: 'Dan', + lower: 'dan', + verified: false, + invitedBy: null, + invitee: null, + }); + expect(usersWithInvitee[1]).toEqual({ + id: 2, + name: 'Andrew', + lower: 'andrew', + verified: false, + invitedBy: null, + invitee: null, + }); + expect(usersWithInvitee[2]).toEqual({ + id: 3, + name: 'Alex', + lower: 'alex', + verified: false, + invitedBy: 1, + invitee: { id: 1, name: 'Dan', lower: 'dan', verified: false, invitedBy: null }, + }); + expect(usersWithInvitee[3]).toEqual({ + id: 4, + name: 'John', + lower: 'john', + verified: false, + invitedBy: 2, + invitee: { id: 2, name: 'Andrew', lower: 'andrew', verified: false, invitedBy: null }, + }); +}); + +test.skip('Get user with invitee and custom fields + limits', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex', invitedBy: 1 }, + { id: 4, name: 'John', invitedBy: 2 }, + ]); + + const usersWithInvitee = await db.query.usersTable.findMany({ + extras: (users, { sql }) => ({ lower: sql`lower(${users.name})`.as('lower_name') }), + limit: 3, + with: { + invitee: { + extras: (invitee, { sql }) => ({ lower: sql`lower(${invitee.name})`.as('lower_name') }), + }, + }, + }); + + expectTypeOf(usersWithInvitee).toEqualTypeOf< + { + id: number; + name: string; + verified: boolean; + lower: string; + invitedBy: number | null; + invitee: { + id: number; + name: string; + verified: boolean; + lower: string; + invitedBy: number | null; + } | null; + }[] + >(); + + usersWithInvitee.sort((a, b) => (a.id > b.id) ? 
1 : -1); + + expect(usersWithInvitee.length).eq(3); + expect(usersWithInvitee[0]?.invitee).toBeNull(); + expect(usersWithInvitee[1]?.invitee).toBeNull(); + expect(usersWithInvitee[2]?.invitee).not.toBeNull(); + + expect(usersWithInvitee[0]).toEqual({ + id: 1, + name: 'Dan', + lower: 'dan', + verified: false, + invitedBy: null, + invitee: null, + }); + expect(usersWithInvitee[1]).toEqual({ + id: 2, + name: 'Andrew', + lower: 'andrew', + verified: false, + invitedBy: null, + invitee: null, + }); + expect(usersWithInvitee[2]).toEqual({ + id: 3, + name: 'Alex', + lower: 'alex', + verified: false, + invitedBy: 1, + invitee: { id: 1, name: 'Dan', lower: 'dan', verified: false, invitedBy: null }, + }); +}); + +test.skip('Get user with invitee + order by', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex', invitedBy: 1 }, + { id: 4, name: 'John', invitedBy: 2 }, + ]); + + const usersWithInvitee = await db.query.usersTable.findMany({ + orderBy: (users, { desc }) => [desc(users.id)], + with: { + invitee: true, + }, + }); + + expectTypeOf(usersWithInvitee).toEqualTypeOf< + { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + invitee: { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + } | null; + }[] + >(); + + expect(usersWithInvitee.length).eq(4); + expect(usersWithInvitee[3]?.invitee).toBeNull(); + expect(usersWithInvitee[2]?.invitee).toBeNull(); + expect(usersWithInvitee[1]?.invitee).not.toBeNull(); + expect(usersWithInvitee[0]?.invitee).not.toBeNull(); + + expect(usersWithInvitee[3]).toEqual({ + id: 1, + name: 'Dan', + verified: false, + invitedBy: null, + invitee: null, + }); + expect(usersWithInvitee[2]).toEqual({ + id: 2, + name: 'Andrew', + verified: false, + invitedBy: null, + invitee: null, + }); + expect(usersWithInvitee[1]).toEqual({ + id: 3, + name: 'Alex', + verified: false, + 
invitedBy: 1, + invitee: { id: 1, name: 'Dan', verified: false, invitedBy: null }, + }); + expect(usersWithInvitee[0]).toEqual({ + id: 4, + name: 'John', + verified: false, + invitedBy: 2, + invitee: { id: 2, name: 'Andrew', verified: false, invitedBy: null }, + }); +}); + +test.skip('Get user with invitee + where', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex', invitedBy: 1 }, + { id: 4, name: 'John', invitedBy: 2 }, + ]); + + const usersWithInvitee = await db.query.usersTable.findMany({ + where: (users, { eq, or }) => (or(eq(users.id, 3), eq(users.id, 4))), + with: { + invitee: true, + }, + }); + + expectTypeOf(usersWithInvitee).toEqualTypeOf< + { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + invitee: { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + } | null; + }[] + >(); + + expect(usersWithInvitee.length).eq(2); + expect(usersWithInvitee[0]?.invitee).not.toBeNull(); + expect(usersWithInvitee[1]?.invitee).not.toBeNull(); + + expect(usersWithInvitee).toContainEqual({ + id: 3, + name: 'Alex', + verified: false, + invitedBy: 1, + invitee: { id: 1, name: 'Dan', verified: false, invitedBy: null }, + }); + expect(usersWithInvitee).toContainEqual({ + id: 4, + name: 'John', + verified: false, + invitedBy: 2, + invitee: { id: 2, name: 'Andrew', verified: false, invitedBy: null }, + }); +}); + +test.skip('Get user with invitee + where + partial', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex', invitedBy: 1 }, + { id: 4, name: 'John', invitedBy: 2 }, + ]); + + const usersWithInvitee = await db.query.usersTable.findMany({ + where: (users, { eq, or }) => (or(eq(users.id, 3), eq(users.id, 4))), + columns: { + id: true, + name: true, + }, + with: { + invitee: { + 
columns: { + id: true, + name: true, + }, + }, + }, + }); + + expectTypeOf(usersWithInvitee).toEqualTypeOf< + { + id: number; + name: string; + invitee: { + id: number; + name: string; + } | null; + }[] + >(); + + expect(usersWithInvitee.length).eq(2); + expect(usersWithInvitee[0]?.invitee).not.toBeNull(); + expect(usersWithInvitee[1]?.invitee).not.toBeNull(); + + expect(usersWithInvitee).toContainEqual({ + id: 3, + name: 'Alex', + invitee: { id: 1, name: 'Dan' }, + }); + expect(usersWithInvitee).toContainEqual({ + id: 4, + name: 'John', + invitee: { id: 2, name: 'Andrew' }, + }); +}); + +test.skip('Get user with invitee + where + partial. Did not select users id, but used it in where', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex', invitedBy: 1 }, + { id: 4, name: 'John', invitedBy: 2 }, + ]); + + const usersWithInvitee = await db.query.usersTable.findMany({ + where: (users, { eq, or }) => (or(eq(users.id, 3), eq(users.id, 4))), + columns: { + name: true, + }, + with: { + invitee: { + columns: { + id: true, + name: true, + }, + }, + }, + }); + + expectTypeOf(usersWithInvitee).toEqualTypeOf< + { + name: string; + invitee: { + id: number; + name: string; + } | null; + }[] + >(); + + expect(usersWithInvitee.length).eq(2); + expect(usersWithInvitee[0]?.invitee).not.toBeNull(); + expect(usersWithInvitee[1]?.invitee).not.toBeNull(); + + expect(usersWithInvitee).toContainEqual({ + name: 'Alex', + invitee: { id: 1, name: 'Dan' }, + }); + expect(usersWithInvitee).toContainEqual({ + name: 'John', + invitee: { id: 2, name: 'Andrew' }, + }); +}); + +test.skip('Get user with invitee + where + partial(true+false)', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex', invitedBy: 1 }, + { id: 4, name: 'John', invitedBy: 2 }, + ]); + + const 
usersWithInvitee = await db.query.usersTable.findMany({ + where: (users, { eq, or }) => (or(eq(users.id, 3), eq(users.id, 4))), + columns: { + id: true, + name: true, + verified: false, + }, + with: { + invitee: { + columns: { + id: true, + name: true, + verified: false, + }, + }, + }, + }); + + expectTypeOf(usersWithInvitee).toEqualTypeOf< + { + id: number; + name: string; + invitee: { + id: number; + name: string; + } | null; + }[] + >(); + + expect(usersWithInvitee.length).eq(2); + expect(usersWithInvitee[0]?.invitee).not.toBeNull(); + expect(usersWithInvitee[1]?.invitee).not.toBeNull(); + + expect(usersWithInvitee).toContainEqual({ + id: 3, + name: 'Alex', + invitee: { id: 1, name: 'Dan' }, + }); + expect(usersWithInvitee).toContainEqual({ + id: 4, + name: 'John', + invitee: { id: 2, name: 'Andrew' }, + }); +}); + +test.skip('Get user with invitee + where + partial(false)', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex', invitedBy: 1 }, + { id: 4, name: 'John', invitedBy: 2 }, + ]); + + const usersWithInvitee = await db.query.usersTable.findMany({ + where: (users, { eq, or }) => (or(eq(users.id, 3), eq(users.id, 4))), + columns: { + verified: false, + }, + with: { + invitee: { + columns: { + name: false, + }, + }, + }, + }); + + expectTypeOf(usersWithInvitee).toEqualTypeOf< + { + id: number; + name: string; + invitedBy: number | null; + invitee: { + id: number; + verified: boolean; + invitedBy: number | null; + } | null; + }[] + >(); + + expect(usersWithInvitee.length).eq(2); + expect(usersWithInvitee[0]?.invitee).not.toBeNull(); + expect(usersWithInvitee[1]?.invitee).not.toBeNull(); + + expect(usersWithInvitee).toContainEqual({ + id: 3, + name: 'Alex', + invitedBy: 1, + invitee: { id: 1, verified: false, invitedBy: null }, + }); + expect(usersWithInvitee).toContainEqual({ + id: 4, + name: 'John', + invitedBy: 2, + invitee: { id: 2, 
verified: false, invitedBy: null }, + }); +}); + +/* + Two first-level relations users+users and users+posts +*/ + +test.skip('Get user with invitee and posts', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex', invitedBy: 1 }, + { id: 4, name: 'John', invitedBy: 2 }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 3, content: 'Post3' }, + ]); + + const response = await db.query.usersTable.findMany({ + with: { + invitee: true, + posts: true, + }, + }); + + expectTypeOf(response).toEqualTypeOf< + { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + posts: { id: number; ownerId: number | null; content: string; createdAt: Date }[]; + invitee: { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + } | null; + }[] + >(); + + response.sort((a, b) => (a.id > b.id) ? 
1 : -1); + + expect(response.length).eq(4); + + expect(response[0]?.invitee).toBeNull(); + expect(response[1]?.invitee).toBeNull(); + expect(response[2]?.invitee).not.toBeNull(); + expect(response[3]?.invitee).not.toBeNull(); + + expect(response[0]?.posts.length).eq(1); + expect(response[1]?.posts.length).eq(1); + expect(response[2]?.posts.length).eq(1); + + expect(response).toContainEqual({ + id: 1, + name: 'Dan', + verified: false, + invitedBy: null, + invitee: null, + posts: [{ id: 1, ownerId: 1, content: 'Post1', createdAt: response[0]?.posts[0]?.createdAt }], + }); + expect(response).toContainEqual({ + id: 2, + name: 'Andrew', + verified: false, + invitedBy: null, + invitee: null, + posts: [{ id: 2, ownerId: 2, content: 'Post2', createdAt: response[1]?.posts[0]?.createdAt }], + }); + expect(response).toContainEqual({ + id: 3, + name: 'Alex', + verified: false, + invitedBy: 1, + invitee: { id: 1, name: 'Dan', verified: false, invitedBy: null }, + posts: [{ id: 3, ownerId: 3, content: 'Post3', createdAt: response[2]?.posts[0]?.createdAt }], + }); + expect(response).toContainEqual({ + id: 4, + name: 'John', + verified: false, + invitedBy: 2, + invitee: { id: 2, name: 'Andrew', verified: false, invitedBy: null }, + posts: [], + }); +}); + +test.skip('Get user with invitee and posts + limit posts and users', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex', invitedBy: 1 }, + { id: 4, name: 'John', invitedBy: 2 }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 1, content: 'Post1.1' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 2, content: 'Post2.1' }, + { ownerId: 3, content: 'Post3' }, + { ownerId: 3, content: 'Post3.1' }, + ]); + + const response = await db.query.usersTable.findMany({ + limit: 3, + with: { + invitee: true, + posts: { + limit: 1, + }, + }, + }); + + 
expectTypeOf(response).toEqualTypeOf< + { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + posts: { id: number; ownerId: number | null; content: string; createdAt: Date }[]; + invitee: { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + } | null; + }[] + >(); + + response.sort((a, b) => (a.id > b.id) ? 1 : -1); + + expect(response.length).eq(3); + + expect(response[0]?.invitee).toBeNull(); + expect(response[1]?.invitee).toBeNull(); + expect(response[2]?.invitee).not.toBeNull(); + + expect(response[0]?.posts.length).eq(1); + expect(response[1]?.posts.length).eq(1); + expect(response[2]?.posts.length).eq(1); + + expect(response).toContainEqual({ + id: 1, + name: 'Dan', + verified: false, + invitedBy: null, + invitee: null, + posts: [{ id: 1, ownerId: 1, content: 'Post1', createdAt: response[0]?.posts[0]?.createdAt }], + }); + expect(response).toContainEqual({ + id: 2, + name: 'Andrew', + verified: false, + invitedBy: null, + invitee: null, + posts: [{ id: 3, ownerId: 2, content: 'Post2', createdAt: response[1]?.posts[0]?.createdAt }], + }); + expect(response).toContainEqual({ + id: 3, + name: 'Alex', + verified: false, + invitedBy: 1, + invitee: { id: 1, name: 'Dan', verified: false, invitedBy: null }, + posts: [{ id: 5, ownerId: 3, content: 'Post3', createdAt: response[2]?.posts[0]?.createdAt }], + }); +}); + +test.skip('Get user with invitee and posts + limits + custom fields in each', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex', invitedBy: 1 }, + { id: 4, name: 'John', invitedBy: 2 }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 1, content: 'Post1.1' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 2, content: 'Post2.1' }, + { ownerId: 3, content: 'Post3' }, + { ownerId: 3, content: 'Post3.1' }, + ]); + + const response = 
await db.query.usersTable.findMany({ + limit: 3, + extras: (users, { sql }) => ({ lower: sql`lower(${users.name})`.as('lower_name') }), + with: { + invitee: { + extras: (users, { sql }) => ({ lower: sql`lower(${users.name})`.as('lower_invitee_name') }), + }, + posts: { + limit: 1, + extras: (posts, { sql }) => ({ lower: sql`lower(${posts.content})`.as('lower_content') }), + }, + }, + }); + + expectTypeOf(response).toEqualTypeOf< + { + id: number; + name: string; + verified: boolean; + lower: string; + invitedBy: number | null; + posts: { id: number; lower: string; ownerId: number | null; content: string; createdAt: Date }[]; + invitee: { + id: number; + name: string; + lower: string; + verified: boolean; + invitedBy: number | null; + } | null; + }[] + >(); + + response.sort((a, b) => (a.id > b.id) ? 1 : -1); + + expect(response.length).eq(3); + + expect(response[0]?.invitee).toBeNull(); + expect(response[1]?.invitee).toBeNull(); + expect(response[2]?.invitee).not.toBeNull(); + + expect(response[0]?.posts.length).eq(1); + expect(response[1]?.posts.length).eq(1); + expect(response[2]?.posts.length).eq(1); + + expect(response).toContainEqual({ + id: 1, + name: 'Dan', + lower: 'dan', + verified: false, + invitedBy: null, + invitee: null, + posts: [{ id: 1, ownerId: 1, content: 'Post1', lower: 'post1', createdAt: response[0]?.posts[0]?.createdAt }], + }); + expect(response).toContainEqual({ + id: 2, + name: 'Andrew', + lower: 'andrew', + verified: false, + invitedBy: null, + invitee: null, + posts: [{ id: 3, ownerId: 2, content: 'Post2', lower: 'post2', createdAt: response[1]?.posts[0]?.createdAt }], + }); + expect(response).toContainEqual({ + id: 3, + name: 'Alex', + lower: 'alex', + verified: false, + invitedBy: 1, + invitee: { id: 1, name: 'Dan', lower: 'dan', verified: false, invitedBy: null }, + posts: [{ id: 5, ownerId: 3, content: 'Post3', lower: 'post3', createdAt: response[2]?.posts[0]?.createdAt }], + }); +}); + +test.skip('Get user with invitee and posts + 
custom fields in each', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex', invitedBy: 1 }, + { id: 4, name: 'John', invitedBy: 2 }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 1, content: 'Post1.1' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 2, content: 'Post2.1' }, + { ownerId: 3, content: 'Post3' }, + { ownerId: 3, content: 'Post3.1' }, + ]); + + const response = await db.query.usersTable.findMany({ + extras: (users, { sql }) => ({ lower: sql`lower(${users.name})`.as('lower_name') }), + with: { + invitee: { + extras: (users, { sql }) => ({ lower: sql`lower(${users.name})`.as('lower_name') }), + }, + posts: { + extras: (posts, { sql }) => ({ lower: sql`lower(${posts.content})`.as('lower_name') }), + }, + }, + }); + + expectTypeOf(response).toEqualTypeOf< + { + id: number; + name: string; + verified: boolean; + lower: string; + invitedBy: number | null; + posts: { id: number; lower: string; ownerId: number | null; content: string; createdAt: Date }[]; + invitee: { + id: number; + name: string; + lower: string; + verified: boolean; + invitedBy: number | null; + } | null; + }[] + >(); + + response.sort((a, b) => (a.id > b.id) ? 1 : -1); + + response[0]?.posts.sort((a, b) => (a.id > b.id) ? 1 : -1); + response[1]?.posts.sort((a, b) => (a.id > b.id) ? 1 : -1); + response[2]?.posts.sort((a, b) => (a.id > b.id) ? 
1 : -1); + + expect(response.length).eq(4); + + expect(response[0]?.invitee).toBeNull(); + expect(response[1]?.invitee).toBeNull(); + expect(response[2]?.invitee).not.toBeNull(); + expect(response[3]?.invitee).not.toBeNull(); + + expect(response[0]?.posts.length).eq(2); + expect(response[1]?.posts.length).eq(2); + expect(response[2]?.posts.length).eq(2); + expect(response[3]?.posts.length).eq(0); + + expect(response).toContainEqual({ + id: 1, + name: 'Dan', + lower: 'dan', + verified: false, + invitedBy: null, + invitee: null, + posts: [{ id: 1, ownerId: 1, content: 'Post1', lower: 'post1', createdAt: response[0]?.posts[0]?.createdAt }, { + id: 2, + ownerId: 1, + content: 'Post1.1', + lower: 'post1.1', + createdAt: response[0]?.posts[1]?.createdAt, + }], + }); + expect(response).toContainEqual({ + id: 2, + name: 'Andrew', + lower: 'andrew', + verified: false, + invitedBy: null, + invitee: null, + posts: [{ id: 3, ownerId: 2, content: 'Post2', lower: 'post2', createdAt: response[1]?.posts[0]?.createdAt }, { + id: 4, + ownerId: 2, + content: 'Post2.1', + lower: 'post2.1', + createdAt: response[1]?.posts[1]?.createdAt, + }], + }); + expect(response).toContainEqual({ + id: 3, + name: 'Alex', + lower: 'alex', + verified: false, + invitedBy: 1, + invitee: { id: 1, name: 'Dan', lower: 'dan', verified: false, invitedBy: null }, + posts: [{ id: 5, ownerId: 3, content: 'Post3', lower: 'post3', createdAt: response[2]?.posts[0]?.createdAt }, { + id: 6, + ownerId: 3, + content: 'Post3.1', + lower: 'post3.1', + createdAt: response[2]?.posts[1]?.createdAt, + }], + }); + expect(response).toContainEqual({ + id: 4, + name: 'John', + lower: 'john', + verified: false, + invitedBy: 2, + invitee: { id: 2, name: 'Andrew', lower: 'andrew', verified: false, invitedBy: null }, + posts: [], + }); +}); + +test.skip('Get user with invitee and posts + orderBy', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 
'Andrew' }, + { id: 3, name: 'Alex', invitedBy: 1 }, + { id: 4, name: 'John', invitedBy: 2 }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 1, content: 'Post1.1' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 2, content: 'Post2.1' }, + { ownerId: 3, content: 'Post3' }, + ]); + + const response = await db.query.usersTable.findMany({ + orderBy: (users, { desc }) => [desc(users.id)], + with: { + invitee: true, + posts: { + orderBy: (posts, { desc }) => [desc(posts.id)], + }, + }, + }); + + expectTypeOf(response).toEqualTypeOf< + { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + posts: { id: number; ownerId: number | null; content: string; createdAt: Date }[]; + invitee: { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + } | null; + }[] + >(); + + expect(response.length).eq(4); + + expect(response[3]?.invitee).toBeNull(); + expect(response[2]?.invitee).toBeNull(); + expect(response[1]?.invitee).not.toBeNull(); + expect(response[0]?.invitee).not.toBeNull(); + + expect(response[0]?.posts.length).eq(0); + expect(response[1]?.posts.length).eq(1); + expect(response[2]?.posts.length).eq(2); + expect(response[3]?.posts.length).eq(2); + + expect(response[3]).toEqual({ + id: 1, + name: 'Dan', + verified: false, + invitedBy: null, + invitee: null, + posts: [{ id: 2, ownerId: 1, content: 'Post1.1', createdAt: response[3]?.posts[0]?.createdAt }, { + id: 1, + ownerId: 1, + content: 'Post1', + createdAt: response[3]?.posts[1]?.createdAt, + }], + }); + expect(response[2]).toEqual({ + id: 2, + name: 'Andrew', + verified: false, + invitedBy: null, + invitee: null, + posts: [{ id: 4, ownerId: 2, content: 'Post2.1', createdAt: response[2]?.posts[0]?.createdAt }, { + id: 3, + ownerId: 2, + content: 'Post2', + createdAt: response[2]?.posts[1]?.createdAt, + }], + }); + expect(response[1]).toEqual({ + id: 3, + name: 'Alex', + verified: false, + invitedBy: 1, + invitee: { 
id: 1, name: 'Dan', verified: false, invitedBy: null }, + posts: [{ + id: 5, + ownerId: 3, + content: 'Post3', + createdAt: response[1]?.posts[0]?.createdAt, + }], + }); + expect(response[0]).toEqual({ + id: 4, + name: 'John', + verified: false, + invitedBy: 2, + invitee: { id: 2, name: 'Andrew', verified: false, invitedBy: null }, + posts: [], + }); +}); + +test.skip('Get user with invitee and posts + where', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex', invitedBy: 1 }, + { id: 4, name: 'John', invitedBy: 2 }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 3, content: 'Post3' }, + ]); + + const response = await db.query.usersTable.findMany({ + where: (users, { eq, or }) => (or(eq(users.id, 2), eq(users.id, 3))), + with: { + invitee: true, + posts: { + where: (posts, { eq }) => (eq(posts.ownerId, 2)), + }, + }, + }); + + expectTypeOf(response).toEqualTypeOf< + { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + posts: { id: number; ownerId: number | null; content: string; createdAt: Date }[]; + invitee: { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + } | null; + }[] + >(); + + response.sort((a, b) => (a.id > b.id) ? 
1 : -1); + + expect(response.length).eq(2); + + expect(response[0]?.invitee).toBeNull(); + expect(response[1]?.invitee).not.toBeNull(); + + expect(response[0]?.posts.length).eq(1); + expect(response[1]?.posts.length).eq(0); + + expect(response).toContainEqual({ + id: 2, + name: 'Andrew', + verified: false, + invitedBy: null, + invitee: null, + posts: [{ id: 2, ownerId: 2, content: 'Post2', createdAt: response[0]?.posts[0]?.createdAt }], + }); + expect(response).toContainEqual({ + id: 3, + name: 'Alex', + verified: false, + invitedBy: 1, + invitee: { id: 1, name: 'Dan', verified: false, invitedBy: null }, + posts: [], + }); +}); + +test.skip('Get user with invitee and posts + limit posts and users + where', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex', invitedBy: 1 }, + { id: 4, name: 'John', invitedBy: 2 }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 1, content: 'Post1.1' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 2, content: 'Post2.1' }, + { ownerId: 3, content: 'Post3' }, + { ownerId: 3, content: 'Post3.1' }, + ]); + + const response = await db.query.usersTable.findMany({ + where: (users, { eq, or }) => (or(eq(users.id, 3), eq(users.id, 4))), + limit: 1, + with: { + invitee: true, + posts: { + where: (posts, { eq }) => (eq(posts.ownerId, 3)), + limit: 1, + }, + }, + }); + + expectTypeOf(response).toEqualTypeOf< + { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + posts: { id: number; ownerId: number | null; content: string; createdAt: Date }[]; + invitee: { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + } | null; + }[] + >(); + + expect(response.length).eq(1); + + expect(response[0]?.invitee).not.toBeNull(); + expect(response[0]?.posts.length).eq(1); + + expect(response).toContainEqual({ + id: 3, + name: 'Alex', + 
verified: false, + invitedBy: 1, + invitee: { id: 1, name: 'Dan', verified: false, invitedBy: null }, + posts: [{ id: 5, ownerId: 3, content: 'Post3', createdAt: response[0]?.posts[0]?.createdAt }], + }); +}); + +test.skip('Get user with invitee and posts + orderBy + where + custom', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex', invitedBy: 1 }, + { id: 4, name: 'John', invitedBy: 2 }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 1, content: 'Post1.1' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 2, content: 'Post2.1' }, + { ownerId: 3, content: 'Post3' }, + ]); + + const response = await db.query.usersTable.findMany({ + orderBy: [desc(usersTable.id)], + where: or(eq(usersTable.id, 3), eq(usersTable.id, 4)), + extras: { + lower: sql`lower(${usersTable.name})`.as('lower_name'), + }, + with: { + invitee: true, + posts: { + where: eq(postsTable.ownerId, 3), + orderBy: [desc(postsTable.id)], + extras: { + lower: sql`lower(${postsTable.content})`.as('lower_name'), + }, + }, + }, + }); + + expectTypeOf(response).toEqualTypeOf< + { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + lower: string; + posts: { id: number; lower: string; ownerId: number | null; content: string; createdAt: Date }[]; + invitee: { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + } | null; + }[] + >(); + + expect(response.length).eq(2); + + expect(response[1]?.invitee).not.toBeNull(); + expect(response[0]?.invitee).not.toBeNull(); + + expect(response[0]?.posts.length).eq(0); + expect(response[1]?.posts.length).eq(1); + + expect(response[1]).toEqual({ + id: 3, + name: 'Alex', + lower: 'alex', + verified: false, + invitedBy: 1, + invitee: { id: 1, name: 'Dan', verified: false, invitedBy: null }, + posts: [{ + id: 5, + ownerId: 3, + content: 'Post3', + lower: 
'post3', + createdAt: response[1]?.posts[0]?.createdAt, + }], + }); + expect(response[0]).toEqual({ + id: 4, + name: 'John', + lower: 'john', + verified: false, + invitedBy: 2, + invitee: { id: 2, name: 'Andrew', verified: false, invitedBy: null }, + posts: [], + }); +}); + +test.skip('Get user with invitee and posts + orderBy + where + partial + custom', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex', invitedBy: 1 }, + { id: 4, name: 'John', invitedBy: 2 }, + ]); + + await db.insert(postsTable).values([ + { ownerId: 1, content: 'Post1' }, + { ownerId: 1, content: 'Post1.1' }, + { ownerId: 2, content: 'Post2' }, + { ownerId: 2, content: 'Post2.1' }, + { ownerId: 3, content: 'Post3' }, + ]); + + const response = await db.query.usersTable.findMany({ + orderBy: [desc(usersTable.id)], + where: or(eq(usersTable.id, 3), eq(usersTable.id, 4)), + extras: { + lower: sql`lower(${usersTable.name})`.as('lower_name'), + }, + columns: { + id: true, + name: true, + }, + with: { + invitee: { + columns: { + id: true, + name: true, + }, + extras: { + lower: sql`lower(${usersTable.name})`.as('lower_name'), + }, + }, + posts: { + columns: { + id: true, + content: true, + }, + where: eq(postsTable.ownerId, 3), + orderBy: [desc(postsTable.id)], + extras: { + lower: sql`lower(${postsTable.content})`.as('lower_name'), + }, + }, + }, + }); + + expectTypeOf(response).toEqualTypeOf< + { + id: number; + name: string; + lower: string; + posts: { id: number; lower: string; content: string }[]; + invitee: { + id: number; + name: string; + lower: string; + } | null; + }[] + >(); + + expect(response.length).eq(2); + + expect(response[1]?.invitee).not.toBeNull(); + expect(response[0]?.invitee).not.toBeNull(); + + expect(response[0]?.posts.length).eq(0); + expect(response[1]?.posts.length).eq(1); + + expect(response[1]).toEqual({ + id: 3, + name: 'Alex', + lower: 'alex', + 
invitee: { id: 1, name: 'Dan', lower: 'dan' }, + posts: [{ + id: 5, + content: 'Post3', + lower: 'post3', + }], + }); + expect(response[0]).toEqual({ + id: 4, + name: 'John', + lower: 'john', + invitee: { id: 2, name: 'Andrew', lower: 'andrew' }, + posts: [], + }); +}); + +/* + One two-level relation users+posts+comments +*/ + +test.skip('Get user with posts and posts with comments', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(postsTable).values([ + { id: 1, ownerId: 1, content: 'Post1' }, + { id: 2, ownerId: 2, content: 'Post2' }, + { id: 3, ownerId: 3, content: 'Post3' }, + ]); + + await db.insert(commentsTable).values([ + { postId: 1, content: 'Comment1', creator: 2 }, + { postId: 2, content: 'Comment2', creator: 2 }, + { postId: 3, content: 'Comment3', creator: 3 }, + ]); + + const response = await db.query.usersTable.findMany({ + with: { + posts: { + with: { + comments: true, + }, + }, + }, + }); + + expectTypeOf(response).toEqualTypeOf< + { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + posts: { + id: number; + content: string; + ownerId: number | null; + createdAt: Date; + comments: { + id: number; + content: string; + createdAt: Date; + creator: number | null; + postId: number | null; + }[]; + }[]; + }[] + >(); + + response.sort((a, b) => (a.id > b.id) ? 
1 : -1); + + expect(response.length).eq(3); + expect(response[0]?.posts.length).eq(1); + expect(response[1]?.posts.length).eq(1); + expect(response[2]?.posts.length).eq(1); + + expect(response[0]?.posts[0]?.comments.length).eq(1); + expect(response[1]?.posts[0]?.comments.length).eq(1); + expect(response[2]?.posts[0]?.comments.length).eq(1); + + expect(response[0]).toEqual({ + id: 1, + name: 'Dan', + verified: false, + invitedBy: null, + posts: [{ + id: 1, + ownerId: 1, + content: 'Post1', + createdAt: response[0]?.posts[0]?.createdAt, + comments: [ + { + id: 1, + content: 'Comment1', + creator: 2, + postId: 1, + createdAt: response[0]?.posts[0]?.comments[0]?.createdAt, + }, + ], + }], + }); + expect(response[1]).toEqual({ + id: 2, + name: 'Andrew', + verified: false, + invitedBy: null, + posts: [{ + id: 2, + ownerId: 2, + content: 'Post2', + createdAt: response[1]?.posts[0]?.createdAt, + comments: [ + { + id: 2, + content: 'Comment2', + creator: 2, + postId: 2, + createdAt: response[1]?.posts[0]?.comments[0]?.createdAt, + }, + ], + }], + }); + // expect(response[2]).toEqual({ + // id: 3, + // name: 'Alex', + // verified: false, + // invitedBy: null, + // posts: [{ + // id: 3, + // ownerId: 3, + // content: 'Post3', + // createdAt: response[2]?.posts[0]?.createdAt, + // comments: [ + // { + // id: 3, + // content: 'Comment3', + // creator: 3, + // postId: 3, + // createdAt: response[2]?.posts[0]?.comments[0]?.createdAt, + // }, + // ], + // }], + // }); +}); + +// Get user with limit posts and limit comments + +// Get user with custom field + post + comment with custom field + +// Get user with limit + posts orderBy + comment orderBy + +// Get user with where + posts where + comment where + +// Get user with where + posts partial where + comment where + +// Get user with where + posts partial where + comment partial(false) where + +// Get user with where partial(false) + posts partial where partial(false) + comment partial(false+true) where + +// Get user with where 
+ posts partial where + comment where. Didn't select field from where in posts + +// Get user with where + posts partial where + comment where. Didn't select field from where for all + +// Get with limit+offset in each + +/* + One two-level + One first-level relation users+posts+comments and users+users +*/ + +/* + One three-level relation users+posts+comments+comment_owner +*/ + +test.skip('Get user with posts and posts with comments and comments with owner', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(postsTable).values([ + { id: 1, ownerId: 1, content: 'Post1' }, + { id: 2, ownerId: 2, content: 'Post2' }, + { id: 3, ownerId: 3, content: 'Post3' }, + ]); + + await db.insert(commentsTable).values([ + { postId: 1, content: 'Comment1', creator: 2 }, + { postId: 2, content: 'Comment2', creator: 2 }, + { postId: 3, content: 'Comment3', creator: 3 }, + ]); + + const response = await db.query.usersTable.findMany({ + with: { + posts: { + with: { + comments: { + with: { + author: true, + }, + }, + }, + }, + }, + }); + + expectTypeOf(response).toEqualTypeOf<{ + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + posts: { + id: number; + content: string; + ownerId: number | null; + createdAt: Date; + comments: { + id: number; + content: string; + createdAt: Date; + creator: number | null; + postId: number | null; + author: { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + } | null; + }[]; + }[]; + }[]>(); + + response.sort((a, b) => (a.id > b.id) ? 
1 : -1); + + expect(response.length).eq(3); + expect(response[0]?.posts.length).eq(1); + expect(response[1]?.posts.length).eq(1); + expect(response[2]?.posts.length).eq(1); + + expect(response[0]?.posts[0]?.comments.length).eq(1); + expect(response[1]?.posts[0]?.comments.length).eq(1); + expect(response[2]?.posts[0]?.comments.length).eq(1); + + expect(response[0]).toEqual({ + id: 1, + name: 'Dan', + verified: false, + invitedBy: null, + posts: [{ + id: 1, + ownerId: 1, + content: 'Post1', + createdAt: response[0]?.posts[0]?.createdAt, + comments: [ + { + id: 1, + content: 'Comment1', + creator: 2, + author: { + id: 2, + name: 'Andrew', + verified: false, + invitedBy: null, + }, + postId: 1, + createdAt: response[0]?.posts[0]?.comments[0]?.createdAt, + }, + ], + }], + }); + expect(response[1]).toEqual({ + id: 2, + name: 'Andrew', + verified: false, + invitedBy: null, + posts: [{ + id: 2, + ownerId: 2, + content: 'Post2', + createdAt: response[1]?.posts[0]?.createdAt, + comments: [ + { + id: 2, + content: 'Comment2', + creator: 2, + author: { + id: 2, + name: 'Andrew', + verified: false, + invitedBy: null, + }, + postId: 2, + createdAt: response[1]?.posts[0]?.comments[0]?.createdAt, + }, + ], + }], + }); +}); + +test.skip('Get user with posts and posts with comments and comments with owner where exists', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(postsTable).values([ + { id: 1, ownerId: 1, content: 'Post1' }, + { id: 2, ownerId: 2, content: 'Post2' }, + { id: 3, ownerId: 3, content: 'Post3' }, + ]); + + await db.insert(commentsTable).values([ + { postId: 1, content: 'Comment1', creator: 2 }, + { postId: 2, content: 'Comment2', creator: 2 }, + { postId: 3, content: 'Comment3', creator: 3 }, + ]); + + const response = await db.query.usersTable.findMany({ + with: { + posts: { + with: { + comments: { + with: { + author: true, + }, + }, + }, + }, + }, + 
where: (table, { exists, eq }) => exists(db.select({ one: sql`1` }).from(usersTable).where(eq(sql`1`, table.id))), + }); + + expectTypeOf(response).toEqualTypeOf<{ + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + posts: { + id: number; + content: string; + ownerId: number | null; + createdAt: Date; + comments: { + id: number; + content: string; + createdAt: Date; + creator: number | null; + postId: number | null; + author: { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + } | null; + }[]; + }[]; + }[]>(); + + expect(response.length).eq(1); + expect(response[0]?.posts.length).eq(1); + + expect(response[0]?.posts[0]?.comments.length).eq(1); + + expect(response[0]).toEqual({ + id: 1, + name: 'Dan', + verified: false, + invitedBy: null, + posts: [{ + id: 1, + ownerId: 1, + content: 'Post1', + createdAt: response[0]?.posts[0]?.createdAt, + comments: [ + { + id: 1, + content: 'Comment1', + creator: 2, + author: { + id: 2, + name: 'Andrew', + verified: false, + invitedBy: null, + }, + postId: 1, + createdAt: response[0]?.posts[0]?.comments[0]?.createdAt, + }, + ], + }], + }); +}); + +/* + One three-level relation + 1 first-level relation + 1. users+posts+comments+comment_owner + 2. 
users+users +*/ + +/* + One four-level relation users+posts+comments+coment_likes +*/ + +/* + [Find Many] Many-to-many cases + + Users+users_to_groups+groups +*/ + +test.skip('[Find Many] Get users with groups', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(groupsTable).values([ + { id: 1, name: 'Group1' }, + { id: 2, name: 'Group2' }, + { id: 3, name: 'Group3' }, + ]); + + await db.insert(usersToGroupsTable).values([ + { userId: 1, groupId: 1 }, + { userId: 2, groupId: 2 }, + { userId: 3, groupId: 3 }, + { userId: 3, groupId: 2 }, + ]); + + const response = await db.query.usersTable.findMany({ + with: { + usersToGroups: { + columns: {}, + with: { + group: true, + }, + }, + }, + }); + + expectTypeOf(response).toEqualTypeOf<{ + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + usersToGroups: { + group: { + id: number; + name: string; + description: string | null; + }; + }[]; + }[]>(); + + response.sort((a, b) => (a.id > b.id) ? 
1 : -1); + + expect(response.length).toEqual(3); + + expect(response[0]?.usersToGroups.length).toEqual(1); + expect(response[1]?.usersToGroups.length).toEqual(1); + expect(response[2]?.usersToGroups.length).toEqual(2); + + expect(response).toContainEqual({ + id: 1, + name: 'Dan', + verified: false, + invitedBy: null, + usersToGroups: [{ + group: { + id: 1, + name: 'Group1', + description: null, + }, + }], + }); + + expect(response).toContainEqual({ + id: 2, + name: 'Andrew', + verified: false, + invitedBy: null, + usersToGroups: [{ + group: { + id: 2, + name: 'Group2', + description: null, + }, + }], + }); + + expect(response).toContainEqual({ + id: 3, + name: 'Alex', + verified: false, + invitedBy: null, + usersToGroups: [{ + group: { + id: 3, + name: 'Group3', + description: null, + }, + }, { + group: { + id: 2, + name: 'Group2', + description: null, + }, + }], + }); +}); + +test.skip('[Find Many] Get groups with users', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(groupsTable).values([ + { id: 1, name: 'Group1' }, + { id: 2, name: 'Group2' }, + { id: 3, name: 'Group3' }, + ]); + + await db.insert(usersToGroupsTable).values([ + { userId: 1, groupId: 1 }, + { userId: 2, groupId: 2 }, + { userId: 3, groupId: 3 }, + { userId: 3, groupId: 2 }, + ]); + + const response = await db.query.groupsTable.findMany({ + with: { + usersToGroups: { + columns: {}, + with: { + user: true, + }, + }, + }, + }); + + expectTypeOf(response).toEqualTypeOf<{ + id: number; + name: string; + description: string | null; + usersToGroups: { + user: { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + }; + }[]; + }[]>(); + + response.sort((a, b) => (a.id > b.id) ? 
1 : -1); + + expect(response.length).toEqual(3); + + expect(response[0]?.usersToGroups.length).toEqual(1); + expect(response[1]?.usersToGroups.length).toEqual(2); + expect(response[2]?.usersToGroups.length).toEqual(1); + + expect(response).toContainEqual({ + id: 1, + name: 'Group1', + description: null, + usersToGroups: [{ + user: { + id: 1, + name: 'Dan', + verified: false, + invitedBy: null, + }, + }], + }); + + expect(response).toContainEqual({ + id: 2, + name: 'Group2', + description: null, + usersToGroups: [{ + user: { + id: 2, + name: 'Andrew', + verified: false, + invitedBy: null, + }, + }, { + user: { + id: 3, + name: 'Alex', + verified: false, + invitedBy: null, + }, + }], + }); + + expect(response).toContainEqual({ + id: 3, + name: 'Group3', + description: null, + usersToGroups: [{ + user: { + id: 3, + name: 'Alex', + verified: false, + invitedBy: null, + }, + }], + }); +}); + +test.skip('[Find Many] Get users with groups + limit', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(groupsTable).values([ + { id: 1, name: 'Group1' }, + { id: 2, name: 'Group2' }, + { id: 3, name: 'Group3' }, + ]); + + await db.insert(usersToGroupsTable).values([ + { userId: 1, groupId: 1 }, + { userId: 2, groupId: 2 }, + { userId: 2, groupId: 3 }, + { userId: 3, groupId: 2 }, + ]); + + const response = await db.query.usersTable.findMany({ + limit: 2, + with: { + usersToGroups: { + limit: 1, + columns: {}, + with: { + group: true, + }, + }, + }, + }); + + expectTypeOf(response).toEqualTypeOf<{ + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + usersToGroups: { + group: { + id: number; + name: string; + description: string | null; + }; + }[]; + }[]>(); + + response.sort((a, b) => (a.id > b.id) ? 
1 : -1); + + expect(response.length).toEqual(2); + + expect(response[0]?.usersToGroups.length).toEqual(1); + expect(response[1]?.usersToGroups.length).toEqual(1); + + expect(response).toContainEqual({ + id: 1, + name: 'Dan', + verified: false, + invitedBy: null, + usersToGroups: [{ + group: { + id: 1, + name: 'Group1', + description: null, + }, + }], + }); + + expect(response).toContainEqual({ + id: 2, + name: 'Andrew', + verified: false, + invitedBy: null, + usersToGroups: [{ + group: { + id: 2, + name: 'Group2', + description: null, + }, + }], + }); +}); + +test.skip('[Find Many] Get groups with users + limit', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(groupsTable).values([ + { id: 1, name: 'Group1' }, + { id: 2, name: 'Group2' }, + { id: 3, name: 'Group3' }, + ]); + + await db.insert(usersToGroupsTable).values([ + { userId: 1, groupId: 1 }, + { userId: 2, groupId: 2 }, + { userId: 3, groupId: 3 }, + { userId: 3, groupId: 2 }, + ]); + + const response = await db.query.groupsTable.findMany({ + limit: 2, + with: { + usersToGroups: { + limit: 1, + columns: {}, + with: { + user: true, + }, + }, + }, + }); + + expectTypeOf(response).toEqualTypeOf<{ + id: number; + name: string; + description: string | null; + usersToGroups: { + user: { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + }; + }[]; + }[]>(); + + response.sort((a, b) => (a.id > b.id) ? 
1 : -1); + + expect(response.length).toEqual(2); + + expect(response[0]?.usersToGroups.length).toEqual(1); + expect(response[1]?.usersToGroups.length).toEqual(1); + + expect(response).toContainEqual({ + id: 1, + name: 'Group1', + description: null, + usersToGroups: [{ + user: { + id: 1, + name: 'Dan', + verified: false, + invitedBy: null, + }, + }], + }); + + expect(response).toContainEqual({ + id: 2, + name: 'Group2', + description: null, + usersToGroups: [{ + user: { + id: 2, + name: 'Andrew', + verified: false, + invitedBy: null, + }, + }], + }); +}); + +test.skip('[Find Many] Get users with groups + limit + where', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(groupsTable).values([ + { id: 1, name: 'Group1' }, + { id: 2, name: 'Group2' }, + { id: 3, name: 'Group3' }, + ]); + + await db.insert(usersToGroupsTable).values([ + { userId: 1, groupId: 1 }, + { userId: 2, groupId: 2 }, + { userId: 2, groupId: 3 }, + { userId: 3, groupId: 2 }, + ]); + + const response = await db.query.usersTable.findMany({ + limit: 1, + where: (_, { eq, or }) => or(eq(usersTable.id, 1), eq(usersTable.id, 2)), + with: { + usersToGroups: { + where: eq(usersToGroupsTable.groupId, 1), + columns: {}, + with: { + group: true, + }, + }, + }, + }); + + expectTypeOf(response).toEqualTypeOf<{ + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + usersToGroups: { + group: { + id: number; + name: string; + description: string | null; + }; + }[]; + }[]>(); + + response.sort((a, b) => (a.id > b.id) ? 
1 : -1); + + expect(response.length).toEqual(1); + + expect(response[0]?.usersToGroups.length).toEqual(1); + + expect(response).toContainEqual({ + id: 1, + name: 'Dan', + verified: false, + invitedBy: null, + usersToGroups: [{ + group: { + id: 1, + name: 'Group1', + description: null, + }, + }], + }); +}); + +test.skip('[Find Many] Get groups with users + limit + where', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(groupsTable).values([ + { id: 1, name: 'Group1' }, + { id: 2, name: 'Group2' }, + { id: 3, name: 'Group3' }, + ]); + + await db.insert(usersToGroupsTable).values([ + { userId: 1, groupId: 1 }, + { userId: 2, groupId: 2 }, + { userId: 3, groupId: 3 }, + { userId: 3, groupId: 2 }, + ]); + + const response = await db.query.groupsTable.findMany({ + limit: 1, + where: gt(groupsTable.id, 1), + with: { + usersToGroups: { + where: eq(usersToGroupsTable.userId, 2), + limit: 1, + columns: {}, + with: { + user: true, + }, + }, + }, + }); + + expectTypeOf(response).toEqualTypeOf<{ + id: number; + name: string; + description: string | null; + usersToGroups: { + user: { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + }; + }[]; + }[]>(); + + response.sort((a, b) => (a.id > b.id) ? 
1 : -1); + + expect(response.length).toEqual(1); + + expect(response[0]?.usersToGroups.length).toEqual(1); + + expect(response).toContainEqual({ + id: 2, + name: 'Group2', + description: null, + usersToGroups: [{ + user: { + id: 2, + name: 'Andrew', + verified: false, + invitedBy: null, + }, + }], + }); +}); + +test.skip('[Find Many] Get users with groups + where', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(groupsTable).values([ + { id: 1, name: 'Group1' }, + { id: 2, name: 'Group2' }, + { id: 3, name: 'Group3' }, + ]); + + await db.insert(usersToGroupsTable).values([ + { userId: 1, groupId: 1 }, + { userId: 2, groupId: 2 }, + { userId: 2, groupId: 3 }, + { userId: 3, groupId: 2 }, + ]); + + const response = await db.query.usersTable.findMany({ + where: (_, { eq, or }) => or(eq(usersTable.id, 1), eq(usersTable.id, 2)), + with: { + usersToGroups: { + where: eq(usersToGroupsTable.groupId, 2), + columns: {}, + with: { + group: true, + }, + }, + }, + }); + + expectTypeOf(response).toEqualTypeOf<{ + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + usersToGroups: { + group: { + id: number; + name: string; + description: string | null; + }; + }[]; + }[]>(); + + response.sort((a, b) => (a.id > b.id) ? 
1 : -1); + + expect(response.length).toEqual(2); + + expect(response[0]?.usersToGroups.length).toEqual(0); + expect(response[1]?.usersToGroups.length).toEqual(1); + + expect(response).toContainEqual({ + id: 1, + name: 'Dan', + verified: false, + invitedBy: null, + usersToGroups: [], + }); + + expect(response).toContainEqual({ + id: 2, + name: 'Andrew', + verified: false, + invitedBy: null, + usersToGroups: [{ + group: { + id: 2, + name: 'Group2', + description: null, + }, + }], + }); +}); + +test.skip('[Find Many] Get groups with users + where', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(groupsTable).values([ + { id: 1, name: 'Group1' }, + { id: 2, name: 'Group2' }, + { id: 3, name: 'Group3' }, + ]); + + await db.insert(usersToGroupsTable).values([ + { userId: 1, groupId: 1 }, + { userId: 2, groupId: 2 }, + { userId: 3, groupId: 3 }, + { userId: 3, groupId: 2 }, + ]); + + const response = await db.query.groupsTable.findMany({ + where: gt(groupsTable.id, 1), + with: { + usersToGroups: { + where: eq(usersToGroupsTable.userId, 2), + columns: {}, + with: { + user: true, + }, + }, + }, + }); + + expectTypeOf(response).toEqualTypeOf<{ + id: number; + name: string; + description: string | null; + usersToGroups: { + user: { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + }; + }[]; + }[]>(); + + response.sort((a, b) => (a.id > b.id) ? 
1 : -1); + + expect(response.length).toEqual(2); + + expect(response[0]?.usersToGroups.length).toEqual(1); + expect(response[1]?.usersToGroups.length).toEqual(0); + + expect(response).toContainEqual({ + id: 2, + name: 'Group2', + description: null, + usersToGroups: [{ + user: { + id: 2, + name: 'Andrew', + verified: false, + invitedBy: null, + }, + }], + }); + + expect(response).toContainEqual({ + id: 3, + name: 'Group3', + description: null, + usersToGroups: [], + }); +}); + +test.skip('[Find Many] Get users with groups + orderBy', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(groupsTable).values([ + { id: 1, name: 'Group1' }, + { id: 2, name: 'Group2' }, + { id: 3, name: 'Group3' }, + ]); + + await db.insert(usersToGroupsTable).values([ + { userId: 1, groupId: 1 }, + { userId: 2, groupId: 2 }, + { userId: 3, groupId: 3 }, + { userId: 3, groupId: 2 }, + ]); + + const response = await db.query.usersTable.findMany({ + orderBy: (users, { desc }) => [desc(users.id)], + with: { + usersToGroups: { + orderBy: [desc(usersToGroupsTable.groupId)], + columns: {}, + with: { + group: true, + }, + }, + }, + }); + + expectTypeOf(response).toEqualTypeOf<{ + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + usersToGroups: { + group: { + id: number; + name: string; + description: string | null; + }; + }[]; + }[]>(); + + expect(response.length).toEqual(3); + + expect(response[0]?.usersToGroups.length).toEqual(2); + expect(response[1]?.usersToGroups.length).toEqual(1); + expect(response[2]?.usersToGroups.length).toEqual(1); + + expect(response[2]).toEqual({ + id: 1, + name: 'Dan', + verified: false, + invitedBy: null, + usersToGroups: [{ + group: { + id: 1, + name: 'Group1', + description: null, + }, + }], + }); + + expect(response[1]).toEqual({ + id: 2, + name: 'Andrew', + verified: false, + invitedBy: 
null, + usersToGroups: [{ + group: { + id: 2, + name: 'Group2', + description: null, + }, + }], + }); + + expect(response[0]).toEqual({ + id: 3, + name: 'Alex', + verified: false, + invitedBy: null, + usersToGroups: [{ + group: { + id: 3, + name: 'Group3', + description: null, + }, + }, { + group: { + id: 2, + name: 'Group2', + description: null, + }, + }], + }); +}); + +test.skip('[Find Many] Get groups with users + orderBy', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(groupsTable).values([ + { id: 1, name: 'Group1' }, + { id: 2, name: 'Group2' }, + { id: 3, name: 'Group3' }, + ]); + + await db.insert(usersToGroupsTable).values([ + { userId: 1, groupId: 1 }, + { userId: 2, groupId: 2 }, + { userId: 3, groupId: 3 }, + { userId: 3, groupId: 2 }, + ]); + + const response = await db.query.groupsTable.findMany({ + orderBy: [desc(groupsTable.id)], + with: { + usersToGroups: { + orderBy: (utg, { desc }) => [desc(utg.userId)], + columns: {}, + with: { + user: true, + }, + }, + }, + }); + + expectTypeOf(response).toEqualTypeOf<{ + id: number; + name: string; + description: string | null; + usersToGroups: { + user: { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + }; + }[]; + }[]>(); + + expect(response.length).toEqual(3); + + expect(response[0]?.usersToGroups.length).toEqual(1); + expect(response[1]?.usersToGroups.length).toEqual(2); + expect(response[2]?.usersToGroups.length).toEqual(1); + + expect(response[2]).toEqual({ + id: 1, + name: 'Group1', + description: null, + usersToGroups: [{ + user: { + id: 1, + name: 'Dan', + verified: false, + invitedBy: null, + }, + }], + }); + + expect(response[1]).toEqual({ + id: 2, + name: 'Group2', + description: null, + usersToGroups: [{ + user: { + id: 3, + name: 'Alex', + verified: false, + invitedBy: null, + }, + }, { + user: { + id: 2, + name: 
'Andrew', + verified: false, + invitedBy: null, + }, + }], + }); + + expect(response[0]).toEqual({ + id: 3, + name: 'Group3', + description: null, + usersToGroups: [{ + user: { + id: 3, + name: 'Alex', + verified: false, + invitedBy: null, + }, + }], + }); +}); + +test.skip('[Find Many] Get users with groups + orderBy + limit', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(groupsTable).values([ + { id: 1, name: 'Group1' }, + { id: 2, name: 'Group2' }, + { id: 3, name: 'Group3' }, + ]); + + await db.insert(usersToGroupsTable).values([ + { userId: 1, groupId: 1 }, + { userId: 2, groupId: 2 }, + { userId: 3, groupId: 3 }, + { userId: 3, groupId: 2 }, + ]); + + const response = await db.query.usersTable.findMany({ + orderBy: (users, { desc }) => [desc(users.id)], + limit: 2, + with: { + usersToGroups: { + limit: 1, + orderBy: [desc(usersToGroupsTable.groupId)], + columns: {}, + with: { + group: true, + }, + }, + }, + }); + + expectTypeOf(response).toEqualTypeOf<{ + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + usersToGroups: { + group: { + id: number; + name: string; + description: string | null; + }; + }[]; + }[]>(); + + expect(response.length).toEqual(2); + + expect(response[0]?.usersToGroups.length).toEqual(1); + expect(response[1]?.usersToGroups.length).toEqual(1); + + expect(response[1]).toEqual({ + id: 2, + name: 'Andrew', + verified: false, + invitedBy: null, + usersToGroups: [{ + group: { + id: 2, + name: 'Group2', + description: null, + }, + }], + }); + + expect(response[0]).toEqual({ + id: 3, + name: 'Alex', + verified: false, + invitedBy: null, + usersToGroups: [{ + group: { + id: 3, + name: 'Group3', + description: null, + }, + }], + }); +}); + +/* + [Find One] Many-to-many cases + + Users+users_to_groups+groups +*/ + +test.skip('[Find One] Get users with groups', async (t) 
=> { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(groupsTable).values([ + { id: 1, name: 'Group1' }, + { id: 2, name: 'Group2' }, + { id: 3, name: 'Group3' }, + ]); + + await db.insert(usersToGroupsTable).values([ + { userId: 1, groupId: 1 }, + { userId: 2, groupId: 2 }, + { userId: 3, groupId: 3 }, + { userId: 3, groupId: 2 }, + ]); + + const response = await db.query.usersTable.findFirst({ + with: { + usersToGroups: { + columns: {}, + with: { + group: true, + }, + }, + }, + }); + + expectTypeOf(response).toEqualTypeOf< + { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + usersToGroups: { + group: { + id: number; + name: string; + description: string | null; + }; + }[]; + } | undefined + >(); + + expect(response?.usersToGroups.length).toEqual(1); + + expect(response).toEqual({ + id: 1, + name: 'Dan', + verified: false, + invitedBy: null, + usersToGroups: [{ + group: { + id: 1, + name: 'Group1', + description: null, + }, + }], + }); +}); + +test.skip('[Find One] Get groups with users', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(groupsTable).values([ + { id: 1, name: 'Group1' }, + { id: 2, name: 'Group2' }, + { id: 3, name: 'Group3' }, + ]); + + await db.insert(usersToGroupsTable).values([ + { userId: 1, groupId: 1 }, + { userId: 2, groupId: 2 }, + { userId: 3, groupId: 3 }, + { userId: 3, groupId: 2 }, + ]); + + const response = await db.query.groupsTable.findFirst({ + with: { + usersToGroups: { + columns: {}, + with: { + user: true, + }, + }, + }, + }); + + expectTypeOf(response).toEqualTypeOf< + { + id: number; + name: string; + description: string | null; + usersToGroups: { + user: { + id: number; + name: string; + verified: boolean; + 
invitedBy: number | null; + }; + }[]; + } | undefined + >(); + + expect(response?.usersToGroups.length).toEqual(1); + + expect(response).toEqual({ + id: 1, + name: 'Group1', + description: null, + usersToGroups: [{ + user: { + id: 1, + name: 'Dan', + verified: false, + invitedBy: null, + }, + }], + }); +}); + +test.skip('[Find One] Get users with groups + limit', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(groupsTable).values([ + { id: 1, name: 'Group1' }, + { id: 2, name: 'Group2' }, + { id: 3, name: 'Group3' }, + ]); + + await db.insert(usersToGroupsTable).values([ + { userId: 1, groupId: 1 }, + { userId: 2, groupId: 2 }, + { userId: 2, groupId: 3 }, + { userId: 3, groupId: 2 }, + ]); + + const response = await db.query.usersTable.findFirst({ + with: { + usersToGroups: { + limit: 1, + columns: {}, + with: { + group: true, + }, + }, + }, + }); + + expectTypeOf(response).toEqualTypeOf< + { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + usersToGroups: { + group: { + id: number; + name: string; + description: string | null; + }; + }[]; + } | undefined + >(); + + expect(response?.usersToGroups.length).toEqual(1); + + expect(response).toEqual({ + id: 1, + name: 'Dan', + verified: false, + invitedBy: null, + usersToGroups: [{ + group: { + id: 1, + name: 'Group1', + description: null, + }, + }], + }); +}); + +test.skip('[Find One] Get groups with users + limit', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(groupsTable).values([ + { id: 1, name: 'Group1' }, + { id: 2, name: 'Group2' }, + { id: 3, name: 'Group3' }, + ]); + + await db.insert(usersToGroupsTable).values([ + { userId: 1, groupId: 1 }, + { userId: 2, groupId: 2 }, + { 
userId: 3, groupId: 3 }, + { userId: 3, groupId: 2 }, + ]); + + const response = await db.query.groupsTable.findFirst({ + with: { + usersToGroups: { + limit: 1, + columns: {}, + with: { + user: true, + }, + }, + }, + }); + + expectTypeOf(response).toEqualTypeOf< + { + id: number; + name: string; + description: string | null; + usersToGroups: { + user: { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + }; + }[]; + } | undefined + >(); + + expect(response?.usersToGroups.length).toEqual(1); + + expect(response).toEqual({ + id: 1, + name: 'Group1', + description: null, + usersToGroups: [{ + user: { + id: 1, + name: 'Dan', + verified: false, + invitedBy: null, + }, + }], + }); +}); + +test.skip('[Find One] Get users with groups + limit + where', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(groupsTable).values([ + { id: 1, name: 'Group1' }, + { id: 2, name: 'Group2' }, + { id: 3, name: 'Group3' }, + ]); + + await db.insert(usersToGroupsTable).values([ + { userId: 1, groupId: 1 }, + { userId: 2, groupId: 2 }, + { userId: 2, groupId: 3 }, + { userId: 3, groupId: 2 }, + ]); + + const response = await db.query.usersTable.findFirst({ + where: (_, { eq, or }) => or(eq(usersTable.id, 1), eq(usersTable.id, 2)), + with: { + usersToGroups: { + where: eq(usersToGroupsTable.groupId, 1), + columns: {}, + with: { + group: true, + }, + }, + }, + }); + + expectTypeOf(response).toEqualTypeOf< + { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + usersToGroups: { + group: { + id: number; + name: string; + description: string | null; + }; + }[]; + } | undefined + >(); + + expect(response?.usersToGroups.length).toEqual(1); + + expect(response).toEqual({ + id: 1, + name: 'Dan', + verified: false, + invitedBy: null, + usersToGroups: [{ + group: { + id: 1, + name: 'Group1', + 
description: null, + }, + }], + }); +}); + +test.skip('[Find One] Get groups with users + limit + where', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(groupsTable).values([ + { id: 1, name: 'Group1' }, + { id: 2, name: 'Group2' }, + { id: 3, name: 'Group3' }, + ]); + + await db.insert(usersToGroupsTable).values([ + { userId: 1, groupId: 1 }, + { userId: 2, groupId: 2 }, + { userId: 3, groupId: 3 }, + { userId: 3, groupId: 2 }, + ]); + + const response = await db.query.groupsTable.findFirst({ + where: gt(groupsTable.id, 1), + with: { + usersToGroups: { + where: eq(usersToGroupsTable.userId, 2), + limit: 1, + columns: {}, + with: { + user: true, + }, + }, + }, + }); + + expectTypeOf(response).toEqualTypeOf< + { + id: number; + name: string; + description: string | null; + usersToGroups: { + user: { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + }; + }[]; + } | undefined + >(); + + expect(response?.usersToGroups.length).toEqual(1); + + expect(response).toEqual({ + id: 2, + name: 'Group2', + description: null, + usersToGroups: [{ + user: { + id: 2, + name: 'Andrew', + verified: false, + invitedBy: null, + }, + }], + }); +}); + +test.skip('[Find One] Get users with groups + where', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(groupsTable).values([ + { id: 1, name: 'Group1' }, + { id: 2, name: 'Group2' }, + { id: 3, name: 'Group3' }, + ]); + + await db.insert(usersToGroupsTable).values([ + { userId: 1, groupId: 1 }, + { userId: 2, groupId: 2 }, + { userId: 2, groupId: 3 }, + { userId: 3, groupId: 2 }, + ]); + + const response = await db.query.usersTable.findFirst({ + where: (_, { eq, or }) => or(eq(usersTable.id, 1), eq(usersTable.id, 
2)), + with: { + usersToGroups: { + where: eq(usersToGroupsTable.groupId, 2), + columns: {}, + with: { + group: true, + }, + }, + }, + }); + + expectTypeOf(response).toEqualTypeOf< + { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + usersToGroups: { + group: { + id: number; + name: string; + description: string | null; + }; + }[]; + } | undefined + >(); + + expect(response?.usersToGroups.length).toEqual(0); + + expect(response).toEqual({ + id: 1, + name: 'Dan', + verified: false, + invitedBy: null, + usersToGroups: [], + }); +}); + +test.skip('[Find One] Get groups with users + where', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(groupsTable).values([ + { id: 1, name: 'Group1' }, + { id: 2, name: 'Group2' }, + { id: 3, name: 'Group3' }, + ]); + + await db.insert(usersToGroupsTable).values([ + { userId: 1, groupId: 1 }, + { userId: 2, groupId: 2 }, + { userId: 3, groupId: 3 }, + { userId: 3, groupId: 2 }, + ]); + + const response = await db.query.groupsTable.findFirst({ + where: gt(groupsTable.id, 1), + with: { + usersToGroups: { + where: eq(usersToGroupsTable.userId, 2), + columns: {}, + with: { + user: true, + }, + }, + }, + }); + + expectTypeOf(response).toEqualTypeOf< + { + id: number; + name: string; + description: string | null; + usersToGroups: { + user: { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + }; + }[]; + } | undefined + >(); + + expect(response?.usersToGroups.length).toEqual(1); + + expect(response).toEqual({ + id: 2, + name: 'Group2', + description: null, + usersToGroups: [{ + user: { + id: 2, + name: 'Andrew', + verified: false, + invitedBy: null, + }, + }], + }); +}); + +test.skip('[Find One] Get users with groups + orderBy', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, 
name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(groupsTable).values([ + { id: 1, name: 'Group1' }, + { id: 2, name: 'Group2' }, + { id: 3, name: 'Group3' }, + ]); + + await db.insert(usersToGroupsTable).values([ + { userId: 1, groupId: 1 }, + { userId: 2, groupId: 2 }, + { userId: 3, groupId: 3 }, + { userId: 3, groupId: 2 }, + ]); + + const response = await db.query.usersTable.findFirst({ + orderBy: (users, { desc }) => [desc(users.id)], + with: { + usersToGroups: { + orderBy: [desc(usersToGroupsTable.groupId)], + columns: {}, + with: { + group: true, + }, + }, + }, + }); + + expectTypeOf(response).toEqualTypeOf< + { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + usersToGroups: { + group: { + id: number; + name: string; + description: string | null; + }; + }[]; + } | undefined + >(); + + expect(response?.usersToGroups.length).toEqual(2); + + expect(response).toEqual({ + id: 3, + name: 'Alex', + verified: false, + invitedBy: null, + usersToGroups: [{ + group: { + id: 3, + name: 'Group3', + description: null, + }, + }, { + group: { + id: 2, + name: 'Group2', + description: null, + }, + }], + }); +}); + +test.skip('[Find One] Get groups with users + orderBy', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(groupsTable).values([ + { id: 1, name: 'Group1' }, + { id: 2, name: 'Group2' }, + { id: 3, name: 'Group3' }, + ]); + + await db.insert(usersToGroupsTable).values([ + { userId: 1, groupId: 1 }, + { userId: 2, groupId: 2 }, + { userId: 3, groupId: 3 }, + { userId: 3, groupId: 2 }, + ]); + + const response = await db.query.groupsTable.findFirst({ + orderBy: [desc(groupsTable.id)], + with: { + usersToGroups: { + orderBy: (utg, { desc }) => [desc(utg.userId)], + columns: {}, + with: { + user: true, + }, + }, + }, + }); + + 
expectTypeOf(response).toEqualTypeOf< + { + id: number; + name: string; + description: string | null; + usersToGroups: { + user: { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + }; + }[]; + } | undefined + >(); + + expect(response?.usersToGroups.length).toEqual(1); + + expect(response).toEqual({ + id: 3, + name: 'Group3', + description: null, + usersToGroups: [{ + user: { + id: 3, + name: 'Alex', + verified: false, + invitedBy: null, + }, + }], + }); +}); + +test.skip('[Find One] Get users with groups + orderBy + limit', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(groupsTable).values([ + { id: 1, name: 'Group1' }, + { id: 2, name: 'Group2' }, + { id: 3, name: 'Group3' }, + ]); + + await db.insert(usersToGroupsTable).values([ + { userId: 1, groupId: 1 }, + { userId: 2, groupId: 2 }, + { userId: 3, groupId: 3 }, + { userId: 3, groupId: 2 }, + ]); + + const response = await db.query.usersTable.findFirst({ + orderBy: (users, { desc }) => [desc(users.id)], + with: { + usersToGroups: { + limit: 1, + orderBy: [desc(usersToGroupsTable.groupId)], + columns: {}, + with: { + group: true, + }, + }, + }, + }); + + expectTypeOf(response).toEqualTypeOf< + { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + usersToGroups: { + group: { + id: number; + name: string; + description: string | null; + }; + }[]; + } | undefined + >(); + + expect(response?.usersToGroups.length).toEqual(1); + + expect(response).toEqual({ + id: 3, + name: 'Alex', + verified: false, + invitedBy: null, + usersToGroups: [{ + group: { + id: 3, + name: 'Group3', + description: null, + }, + }], + }); +}); + +test.skip('Get groups with users + orderBy + limit', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 
'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(groupsTable).values([ + { id: 1, name: 'Group1' }, + { id: 2, name: 'Group2' }, + { id: 3, name: 'Group3' }, + ]); + + await db.insert(usersToGroupsTable).values([ + { userId: 1, groupId: 1 }, + { userId: 2, groupId: 2 }, + { userId: 3, groupId: 3 }, + { userId: 3, groupId: 2 }, + ]); + + const response = await db.query.groupsTable.findMany({ + orderBy: [desc(groupsTable.id)], + limit: 2, + with: { + usersToGroups: { + limit: 1, + orderBy: (utg, { desc }) => [desc(utg.userId)], + columns: {}, + with: { + user: true, + }, + }, + }, + }); + + expectTypeOf(response).toEqualTypeOf< + { + id: number; + name: string; + description: string | null; + usersToGroups: { + user: { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + }; + }[]; + }[] + >(); + + expect(response.length).toEqual(2); + + expect(response[0]?.usersToGroups.length).toEqual(1); + expect(response[1]?.usersToGroups.length).toEqual(1); + + expect(response[1]).toEqual({ + id: 2, + name: 'Group2', + description: null, + usersToGroups: [{ + user: { + id: 3, + name: 'Alex', + verified: false, + invitedBy: null, + }, + }], + }); + + expect(response[0]).toEqual({ + id: 3, + name: 'Group3', + description: null, + usersToGroups: [{ + user: { + id: 3, + name: 'Alex', + verified: false, + invitedBy: null, + }, + }], + }); +}); + +test.skip('Get users with groups + custom', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(groupsTable).values([ + { id: 1, name: 'Group1' }, + { id: 2, name: 'Group2' }, + { id: 3, name: 'Group3' }, + ]); + + await db.insert(usersToGroupsTable).values([ + { userId: 1, groupId: 1 }, + { userId: 2, groupId: 2 }, + { userId: 3, groupId: 3 }, + { userId: 3, groupId: 2 }, + ]); + + const response = await db.query.usersTable.findMany({ + extras: { + 
lower: sql`lower(${usersTable.name})`.as('lower_name'), + }, + with: { + usersToGroups: { + columns: {}, + with: { + group: { + extras: { + lower: sql`lower(${groupsTable.name})`.as('lower_name'), + }, + }, + }, + }, + }, + }); + + expectTypeOf(response).toEqualTypeOf< + { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + lower: string; + usersToGroups: { + group: { + id: number; + name: string; + description: string | null; + lower: string; + }; + }[]; + }[] + >(); + + response.sort((a, b) => (a.id > b.id) ? 1 : -1); + + expect(response.length).toEqual(3); + + expect(response[0]?.usersToGroups.length).toEqual(1); + expect(response[1]?.usersToGroups.length).toEqual(1); + expect(response[2]?.usersToGroups.length).toEqual(2); + + expect(response).toContainEqual({ + id: 1, + name: 'Dan', + lower: 'dan', + verified: false, + invitedBy: null, + usersToGroups: [{ + group: { + id: 1, + name: 'Group1', + lower: 'group1', + description: null, + }, + }], + }); + + expect(response).toContainEqual({ + id: 2, + name: 'Andrew', + lower: 'andrew', + verified: false, + invitedBy: null, + usersToGroups: [{ + group: { + id: 2, + name: 'Group2', + lower: 'group2', + description: null, + }, + }], + }); + + expect(response).toContainEqual({ + id: 3, + name: 'Alex', + lower: 'alex', + verified: false, + invitedBy: null, + usersToGroups: [{ + group: { + id: 3, + name: 'Group3', + lower: 'group3', + description: null, + }, + }, { + group: { + id: 2, + name: 'Group2', + lower: 'group2', + description: null, + }, + }], + }); +}); + +test.skip('Get groups with users + custom', async (t) => { + const { singlestoreDb: db } = t; + + await db.insert(usersTable).values([ + { id: 1, name: 'Dan' }, + { id: 2, name: 'Andrew' }, + { id: 3, name: 'Alex' }, + ]); + + await db.insert(groupsTable).values([ + { id: 1, name: 'Group1' }, + { id: 2, name: 'Group2' }, + { id: 3, name: 'Group3' }, + ]); + + await db.insert(usersToGroupsTable).values([ + { userId: 1, groupId: 1 }, 
+ { userId: 2, groupId: 2 }, + { userId: 3, groupId: 3 }, + { userId: 3, groupId: 2 }, + ]); + + const response = await db.query.groupsTable.findMany({ + extras: (table, { sql }) => ({ + lower: sql`lower(${table.name})`.as('lower_name'), + }), + with: { + usersToGroups: { + columns: {}, + with: { + user: { + extras: (table, { sql }) => ({ + lower: sql`lower(${table.name})`.as('lower_name'), + }), + }, + }, + }, + }, + }); + + expectTypeOf(response).toEqualTypeOf< + { + id: number; + name: string; + description: string | null; + lower: string; + usersToGroups: { + user: { + id: number; + name: string; + verified: boolean; + invitedBy: number | null; + lower: string; + }; + }[]; + }[] + >(); + + response.sort((a, b) => (a.id > b.id) ? 1 : -1); + + expect(response.length).toEqual(3); + + expect(response[0]?.usersToGroups.length).toEqual(1); + expect(response[1]?.usersToGroups.length).toEqual(2); + expect(response[2]?.usersToGroups.length).toEqual(1); + + expect(response).toContainEqual({ + id: 1, + name: 'Group1', + lower: 'group1', + description: null, + usersToGroups: [{ + user: { + id: 1, + name: 'Dan', + lower: 'dan', + verified: false, + invitedBy: null, + }, + }], + }); + + expect(response).toContainEqual({ + id: 2, + name: 'Group2', + lower: 'group2', + description: null, + usersToGroups: [{ + user: { + id: 2, + name: 'Andrew', + lower: 'andrew', + verified: false, + invitedBy: null, + }, + }, { + user: { + id: 3, + name: 'Alex', + lower: 'alex', + verified: false, + invitedBy: null, + }, + }], + }); + + expect(response).toContainEqual({ + id: 3, + name: 'Group3', + lower: 'group3', + description: null, + usersToGroups: [{ + user: { + id: 3, + name: 'Alex', + lower: 'alex', + verified: false, + invitedBy: null, + }, + }], + }); +}); + +test('.toSQL()', () => { + const query = db.query.usersTable.findFirst().toSQL(); + + expect(query).toHaveProperty('sql', expect.any(String)); + expect(query).toHaveProperty('params', expect.any(Array)); +}); + +// + custom + 
where + orderby + +// + custom + where + orderby + limit + +// + partial + +// + partial(false) + +// + partial + orderBy + where (all not selected) + +/* + One four-level relation users+posts+comments+comment_likes + + users+users_to_groups+groups +*/ + +/* + Really hard case + 1. users+posts+comments+comment_likes + 2. users+users_to_groups+groups + 3. users+users +*/ diff --git a/integration-tests/tests/replicas/singlestore.test.ts b/integration-tests/tests/replicas/singlestore.test.ts index 8bf3bd396..76d84c972 100644 --- a/integration-tests/tests/replicas/singlestore.test.ts +++ b/integration-tests/tests/replicas/singlestore.test.ts @@ -1,6 +1,6 @@ import { sql } from 'drizzle-orm'; -import { boolean, singlestoreTable, serial, text, withReplicas } from 'drizzle-orm/singlestore-core'; import { drizzle } from 'drizzle-orm/singlestore'; +import { boolean, serial, singlestoreTable, text, withReplicas } from 'drizzle-orm/singlestore-core'; import { describe, expect, it, vi } from 'vitest'; const usersTable = singlestoreTable('users', { @@ -558,9 +558,9 @@ describe('[transaction] replicas singlestore', () => { describe('[findFirst] read replicas singlestore', () => { it('primary findFirst', () => { - const primaryDb = drizzle({} as any, { schema: { usersTable }}); - const read1 = drizzle({} as any, { schema: { usersTable }}); - const read2 = drizzle({} as any, { schema: { usersTable }}); + const primaryDb = drizzle({} as any, { schema: { usersTable } }); + const read1 = drizzle({} as any, { schema: { usersTable } }); + const read2 = drizzle({} as any, { schema: { usersTable } }); const db = withReplicas(primaryDb, [read1, read2]); @@ -578,9 +578,9 @@ describe('[findFirst] read replicas singlestore', () => { }); it('random replica findFirst', () => { - const primaryDb = drizzle({} as any, { schema: { usersTable }}); - const read1 = drizzle({} as any, { schema: { usersTable }}); - const read2 = drizzle({} as any, { schema: { usersTable }}); + const primaryDb = 
drizzle({} as any, { schema: { usersTable } }); + const read1 = drizzle({} as any, { schema: { usersTable } }); + const read2 = drizzle({} as any, { schema: { usersTable } }); const randomMockReplica = vi.fn().mockReturnValueOnce(read1).mockReturnValueOnce(read2); @@ -607,8 +607,8 @@ describe('[findFirst] read replicas singlestore', () => { }); it('single read replica findFirst', () => { - const primaryDb = drizzle({} as any, { schema: { usersTable }}); - const read1 = drizzle({} as any, { schema: { usersTable }}); + const primaryDb = drizzle({} as any, { schema: { usersTable } }); + const read1 = drizzle({} as any, { schema: { usersTable } }); const db = withReplicas(primaryDb, [read1]); @@ -625,8 +625,8 @@ describe('[findFirst] read replicas singlestore', () => { }); it('single read replica findFirst + primary findFirst', () => { - const primaryDb = drizzle({} as any, { schema: { usersTable }}); - const read1 = drizzle({} as any, { schema: { usersTable }}); + const primaryDb = drizzle({} as any, { schema: { usersTable } }); + const read1 = drizzle({} as any, { schema: { usersTable } }); const db = withReplicas(primaryDb, [read1]); @@ -644,9 +644,9 @@ describe('[findFirst] read replicas singlestore', () => { }); it('always first read findFirst', () => { - const primaryDb = drizzle({} as any, { schema: { usersTable }}); - const read1 = drizzle({} as any, { schema: { usersTable }}); - const read2 = drizzle({} as any, { schema: { usersTable }}); + const primaryDb = drizzle({} as any, { schema: { usersTable } }); + const read1 = drizzle({} as any, { schema: { usersTable } }); + const read2 = drizzle({} as any, { schema: { usersTable } }); const db = withReplicas(primaryDb, [read1, read2], (replicas) => { return replicas[0]!; @@ -670,9 +670,9 @@ describe('[findFirst] read replicas singlestore', () => { describe('[findMany] read replicas singlestore', () => { it('primary findMany', () => { - const primaryDb = drizzle({} as any, { schema: { usersTable }}); - const read1 
= drizzle({} as any, { schema: { usersTable }}); - const read2 = drizzle({} as any, { schema: { usersTable }}); + const primaryDb = drizzle({} as any, { schema: { usersTable } }); + const read1 = drizzle({} as any, { schema: { usersTable } }); + const read2 = drizzle({} as any, { schema: { usersTable } }); const db = withReplicas(primaryDb, [read1, read2]); @@ -691,9 +691,9 @@ describe('[findMany] read replicas singlestore', () => { }); it('random replica findMany', () => { - const primaryDb = drizzle({} as any, { schema: { usersTable }}); - const read1 = drizzle({} as any, { schema: { usersTable }}); - const read2 = drizzle({} as any, { schema: { usersTable }}); + const primaryDb = drizzle({} as any, { schema: { usersTable } }); + const read1 = drizzle({} as any, { schema: { usersTable } }); + const read2 = drizzle({} as any, { schema: { usersTable } }); const randomMockReplica = vi.fn().mockReturnValueOnce(read1).mockReturnValueOnce(read2); @@ -724,8 +724,8 @@ describe('[findMany] read replicas singlestore', () => { }); it('single read replica findMany', () => { - const primaryDb = drizzle({} as any, { schema: { usersTable }}); - const read1 = drizzle({} as any, { schema: { usersTable }}); + const primaryDb = drizzle({} as any, { schema: { usersTable } }); + const read1 = drizzle({} as any, { schema: { usersTable } }); const db = withReplicas(primaryDb, [read1]); @@ -748,8 +748,8 @@ describe('[findMany] read replicas singlestore', () => { }); it('single read replica findMany + primary findMany', () => { - const primaryDb = drizzle({} as any, { schema: { usersTable }}); - const read1 = drizzle({} as any, { schema: { usersTable }}); + const primaryDb = drizzle({} as any, { schema: { usersTable } }); + const read1 = drizzle({} as any, { schema: { usersTable } }); const db = withReplicas(primaryDb, [read1]); @@ -774,9 +774,9 @@ describe('[findMany] read replicas singlestore', () => { }); it('always first read findMany', () => { - const primaryDb = drizzle({} as any, 
{ schema: { usersTable }}); - const read1 = drizzle({} as any, { schema: { usersTable }}); - const read2 = drizzle({} as any, { schema: { usersTable }}); + const primaryDb = drizzle({} as any, { schema: { usersTable } }); + const read1 = drizzle({} as any, { schema: { usersTable } }); + const read2 = drizzle({} as any, { schema: { usersTable } }); const db = withReplicas(primaryDb, [read1, read2], (replicas) => { return replicas[0]!; diff --git a/integration-tests/tests/singlestore/singlestore-common.ts b/integration-tests/tests/singlestore/singlestore-common.ts index 851484f41..037c27202 100644 --- a/integration-tests/tests/singlestore/singlestore-common.ts +++ b/integration-tests/tests/singlestore/singlestore-common.ts @@ -293,7 +293,6 @@ export function tests(driver?: string) { testRunNumber += 1; console.log(`Test number: ${testRunNumber}`); - }); async function setupReturningFunctionsTest(db: SingleStoreDatabase) { @@ -2401,6 +2400,7 @@ export function tests(driver?: string) { await db.execute(sql`drop table if exists \`datestable\``); }); + // TODO (https://memsql.atlassian.net/browse/MCDB-63261) allow chaining limit and orderby in subquery test('set operations (union) from query builder with subquery', async (ctx) => { const { db } = ctx.singlestore; diff --git a/integration-tests/tests/singlestore/singlestore-proxy.test.ts b/integration-tests/tests/singlestore/singlestore-proxy.test.ts new file mode 100644 index 000000000..51dc48a4a --- /dev/null +++ b/integration-tests/tests/singlestore/singlestore-proxy.test.ts @@ -0,0 +1,140 @@ +import retry from 'async-retry'; +import type { SingleStoreRemoteDatabase } from 'drizzle-orm/singlestore-proxy'; +import { drizzle as proxyDrizzle } from 'drizzle-orm/singlestore-proxy'; +import * as mysql2 from 'mysql2/promise'; +import { afterAll, beforeAll, beforeEach } from 'vitest'; +import { skipTests } from '~/common'; +import { createDockerDB, tests } from './singlestore-common'; + +const ENABLE_LOGGING = false; + +// 
eslint-disable-next-line drizzle-internal/require-entity-kind +class ServerSimulator { + constructor(private db: mysql2.Connection) {} + + async query(sql: string, params: any[], method: 'all' | 'execute') { + if (method === 'all') { + try { + const result = await this.db.query({ + sql, + values: params, + rowsAsArray: true, + typeCast: function(field: any, next: any) { + if (field.type === 'TIMESTAMP' || field.type === 'DATETIME' || field.type === 'DATE') { + return field.string(); + } + return next(); + }, + }); + + return { data: result[0] as any }; + } catch (e: any) { + return { error: e }; + } + } else if (method === 'execute') { + try { + const result = await this.db.query({ + sql, + values: params, + typeCast: function(field: any, next: any) { + if (field.type === 'TIMESTAMP' || field.type === 'DATETIME' || field.type === 'DATE') { + return field.string(); + } + return next(); + }, + }); + + return { data: result as any }; + } catch (e: any) { + return { error: e }; + } + } else { + return { error: 'Unknown method value' }; + } + } + + async migrations(queries: string[]) { + await this.db.query('START TRANSACTION'); + try { + for (const query of queries) { + await this.db.query(query); + } + await this.db.query('COMMIT'); + } catch (e) { + await this.db.query('ROLLBACK'); + throw e; + } + + return {}; + } +} + +let db: SingleStoreRemoteDatabase; +let client: mysql2.Connection; +let serverSimulator: ServerSimulator; + +beforeAll(async () => { + let connectionString; + if (process.env['SINGLESTORE_CONNECTION_STRING']) { + connectionString = process.env['SINGLESTORE_CONNECTION_STRING']; + } else { + const { connectionString: conStr } = await createDockerDB(); + connectionString = conStr; + } + client = await retry(async () => { + client = await mysql2.createConnection(connectionString); + await client.connect(); + return client; + }, { + retries: 20, + factor: 1, + minTimeout: 250, + maxTimeout: 250, + randomize: false, + onRetry() { + client?.end(); + }, + 
}); + + await client.query(`CREATE DATABASE IF NOT EXISTS drizzle;`); + await client.changeUser({ database: 'drizzle' }); + + serverSimulator = new ServerSimulator(client); + db = proxyDrizzle(async (sql, params, method) => { + try { + const response = await serverSimulator.query(sql, params, method); + + if (response.error !== undefined) { + throw response.error; + } + + return { rows: response.data }; + } catch (e: any) { + console.error('Error from singlestore proxy server:', e.message); + throw e; + } + }, { logger: ENABLE_LOGGING }); +}); + +afterAll(async () => { + await client?.end(); +}); + +beforeEach((ctx) => { + ctx.singlestore = { + db, + }; +}); + +skipTests([ + 'select iterator w/ prepared statement', + 'select iterator', + 'nested transaction rollback', + 'nested transaction', + 'transaction rollback', + 'transaction', + 'transaction with options (set isolationLevel)', + 'migrator', +]); + +tests(); diff --git a/integration-tests/tests/sqlite/sqlite-common.ts b/integration-tests/tests/sqlite/sqlite-common.ts index be452bcf1..e8ddb86e6 100644 --- a/integration-tests/tests/sqlite/sqlite-common.ts +++ b/integration-tests/tests/sqlite/sqlite-common.ts @@ -2679,6 +2679,214 @@ export function tests() { expect(eachUser.updatedAt!.valueOf()).toBeGreaterThan(Date.now() - msDelay); } }); + + test('$count separate', async (ctx) => { + const { db } = ctx.sqlite; + + const countTestTable = sqliteTable('count_test', { + id: int('id').notNull(), + name: text('name').notNull(), + }); + + await db.run(sql`drop table if exists ${countTestTable}`); + await db.run(sql`create table ${countTestTable} (id int, name text)`); + + await db.insert(countTestTable).values([ + { id: 1, name: 'First' }, + { id: 2, name: 'Second' }, + { id: 3, name: 'Third' }, + { id: 4, name: 'Fourth' }, + ]); + + const count = await db.$count(countTestTable); + + await db.run(sql`drop table ${countTestTable}`); + + expect(count).toStrictEqual(4); + }); + + test('$count embedded', async (ctx) => { 
+ const { db } = ctx.sqlite; + + const countTestTable = sqliteTable('count_test', { + id: int('id').notNull(), + name: text('name').notNull(), + }); + + await db.run(sql`drop table if exists ${countTestTable}`); + await db.run(sql`create table ${countTestTable} (id int, name text)`); + + await db.insert(countTestTable).values([ + { id: 1, name: 'First' }, + { id: 2, name: 'Second' }, + { id: 3, name: 'Third' }, + { id: 4, name: 'Fourth' }, + ]); + + const count = await db.select({ + count: db.$count(countTestTable), + }).from(countTestTable); + + await db.run(sql`drop table ${countTestTable}`); + + expect(count).toStrictEqual([ + { count: 4 }, + { count: 4 }, + { count: 4 }, + { count: 4 }, + ]); + }); + + test('$count separate reuse', async (ctx) => { + const { db } = ctx.sqlite; + + const countTestTable = sqliteTable('count_test', { + id: int('id').notNull(), + name: text('name').notNull(), + }); + + await db.run(sql`drop table if exists ${countTestTable}`); + await db.run(sql`create table ${countTestTable} (id int, name text)`); + + await db.insert(countTestTable).values([ + { id: 1, name: 'First' }, + { id: 2, name: 'Second' }, + { id: 3, name: 'Third' }, + { id: 4, name: 'Fourth' }, + ]); + + const count = db.$count(countTestTable); + + const count1 = await count; + + await db.insert(countTestTable).values({ id: 5, name: 'fifth' }); + + const count2 = await count; + + await db.insert(countTestTable).values({ id: 6, name: 'sixth' }); + + const count3 = await count; + + await db.run(sql`drop table ${countTestTable}`); + + expect(count1).toStrictEqual(4); + expect(count2).toStrictEqual(5); + expect(count3).toStrictEqual(6); + }); + + test('$count embedded reuse', async (ctx) => { + const { db } = ctx.sqlite; + + const countTestTable = sqliteTable('count_test', { + id: int('id').notNull(), + name: text('name').notNull(), + }); + + await db.run(sql`drop table if exists ${countTestTable}`); + await db.run(sql`create table ${countTestTable} (id int, name text)`); + + 
await db.insert(countTestTable).values([ + { id: 1, name: 'First' }, + { id: 2, name: 'Second' }, + { id: 3, name: 'Third' }, + { id: 4, name: 'Fourth' }, + ]); + + const count = db.select({ + count: db.$count(countTestTable), + }).from(countTestTable); + + const count1 = await count; + + await db.insert(countTestTable).values({ id: 5, name: 'fifth' }); + + const count2 = await count; + + await db.insert(countTestTable).values({ id: 6, name: 'sixth' }); + + const count3 = await count; + + await db.run(sql`drop table ${countTestTable}`); + + expect(count1).toStrictEqual([ + { count: 4 }, + { count: 4 }, + { count: 4 }, + { count: 4 }, + ]); + expect(count2).toStrictEqual([ + { count: 5 }, + { count: 5 }, + { count: 5 }, + { count: 5 }, + { count: 5 }, + ]); + expect(count3).toStrictEqual([ + { count: 6 }, + { count: 6 }, + { count: 6 }, + { count: 6 }, + { count: 6 }, + { count: 6 }, + ]); + }); + + test('$count separate with filters', async (ctx) => { + const { db } = ctx.sqlite; + + const countTestTable = sqliteTable('count_test', { + id: int('id').notNull(), + name: text('name').notNull(), + }); + + await db.run(sql`drop table if exists ${countTestTable}`); + await db.run(sql`create table ${countTestTable} (id int, name text)`); + + await db.insert(countTestTable).values([ + { id: 1, name: 'First' }, + { id: 2, name: 'Second' }, + { id: 3, name: 'Third' }, + { id: 4, name: 'Fourth' }, + ]); + + const count = await db.$count(countTestTable, gt(countTestTable.id, 1)); + + await db.run(sql`drop table ${countTestTable}`); + + expect(count).toStrictEqual(3); + }); + + test('$count embedded with filters', async (ctx) => { + const { db } = ctx.sqlite; + + const countTestTable = sqliteTable('count_test', { + id: int('id').notNull(), + name: text('name').notNull(), + }); + + await db.run(sql`drop table if exists ${countTestTable}`); + await db.run(sql`create table ${countTestTable} (id int, name text)`); + + await db.insert(countTestTable).values([ + { id: 1, name: 
'First' }, + { id: 2, name: 'Second' }, + { id: 3, name: 'Third' }, + { id: 4, name: 'Fourth' }, + ]); + + const count = await db.select({ + count: db.$count(countTestTable, gt(countTestTable.id, 1)), + }).from(countTestTable); + + await db.run(sql`drop table ${countTestTable}`); + + expect(count).toStrictEqual([ + { count: 3 }, + { count: 3 }, + { count: 3 }, + { count: 3 }, + ]); + }); }); test('table configs: unique third param', () => { diff --git a/pnpm-lock.yaml b/pnpm-lock.yaml index 4296fcc10..2870df804 100644 --- a/pnpm-lock.yaml +++ b/pnpm-lock.yaml @@ -45,7 +45,7 @@ importers: version: link:drizzle-orm/dist drizzle-orm-old: specifier: npm:drizzle-orm@^0.27.2 - version: drizzle-orm@0.27.2(@aws-sdk/client-rds-data@3.583.0)(@cloudflare/workers-types@4.20240524.0)(@libsql/client@0.5.6)(@neondatabase/serverless@0.9.3)(@opentelemetry/api@1.8.0)(@planetscale/database@1.18.0)(@types/better-sqlite3@7.6.10)(@types/pg@8.11.6)(@types/sql.js@1.4.9)(@vercel/postgres@0.8.0)(better-sqlite3@9.6.0)(bun-types@1.0.3)(knex@2.5.1(better-sqlite3@9.6.0)(mysql2@3.11.0)(pg@8.11.5)(sqlite3@5.1.7))(kysely@0.25.0)(mysql2@3.11.0)(pg@8.11.5)(postgres@3.4.4)(sql.js@1.10.3)(sqlite3@5.1.7) + version: drizzle-orm@0.27.2(@aws-sdk/client-rds-data@3.583.0)(@cloudflare/workers-types@4.20240524.0)(@libsql/client@0.10.0)(@neondatabase/serverless@0.9.3)(@opentelemetry/api@1.8.0)(@planetscale/database@1.18.0)(@types/better-sqlite3@7.6.10)(@types/pg@8.11.6)(@types/sql.js@1.4.9)(@vercel/postgres@0.8.0)(better-sqlite3@9.6.0)(bun-types@1.0.3)(knex@2.5.1(better-sqlite3@9.6.0)(mysql2@3.3.3)(pg@8.11.5)(sqlite3@5.1.7))(kysely@0.25.0)(mysql2@3.3.3)(pg@8.11.5)(postgres@3.4.4)(sql.js@1.10.3)(sqlite3@5.1.7) eslint: specifier: ^8.50.0 version: 8.50.0 @@ -123,8 +123,8 @@ importers: specifier: ^0.2.1 version: 0.2.2(hono@4.5.0)(zod@3.23.7) '@libsql/client': - specifier: ^0.4.2 - version: 0.4.3(bufferutil@4.0.8)(encoding@0.1.13)(utf-8-validate@6.0.3) + specifier: ^0.10.0 + version: 
0.10.0(bufferutil@4.0.8)(utf-8-validate@6.0.3) '@neondatabase/serverless': specifier: ^0.9.1 version: 0.9.3 @@ -182,6 +182,9 @@ importers: better-sqlite3: specifier: ^9.4.3 version: 9.6.0 + bun-types: + specifier: ^0.6.6 + version: 0.6.14 camelcase: specifier: ^7.0.1 version: 7.0.1 @@ -303,8 +306,8 @@ importers: specifier: ^0.1.1 version: 0.1.5 '@libsql/client': - specifier: ^0.5.6 - version: 0.5.6(bufferutil@4.0.8)(encoding@0.1.13)(utf-8-validate@6.0.3) + specifier: ^0.10.0 + version: 0.10.0(bufferutil@4.0.8)(utf-8-validate@6.0.3) '@neondatabase/serverless': specifier: ^0.9.0 version: 0.9.0 @@ -551,9 +554,6 @@ importers: '@electric-sql/pglite': specifier: ^0.1.1 version: 0.1.5 - '@libsql/client': - specifier: ^0.5.6 - version: 0.5.6(bufferutil@4.0.8)(encoding@0.1.13)(utf-8-validate@6.0.3) '@miniflare/d1': specifier: ^2.14.2 version: 2.14.2 @@ -645,6 +645,9 @@ importers: specifier: ^3.20.2 version: 3.23.7 devDependencies: + '@libsql/client': + specifier: ^0.10.0 + version: 0.10.0(bufferutil@4.0.8)(utf-8-validate@6.0.3) '@neondatabase/serverless': specifier: 0.9.0 version: 0.9.0 @@ -2974,10 +2977,12 @@ packages: '@humanwhocodes/config-array@0.11.11': resolution: {integrity: sha512-N2brEuAadi0CcdeMXUkhbZB84eskAc8MEX1By6qEchoVywSgXPIjou4rYsl0V3Hj0ZnuGycGCjdNgockbzeWNA==} engines: {node: '>=10.10.0'} + deprecated: Use @eslint/config-array instead '@humanwhocodes/config-array@0.11.13': resolution: {integrity: sha512-JSBDMiDKSzQVngfRjOdFXgFfklaXI4K9nLF49Auh21lmBWRLIK3+xTErTWD4KU54pb6coM6ESE7Awz/FNU3zgQ==} engines: {node: '>=10.10.0'} + deprecated: Use @eslint/config-array instead '@humanwhocodes/config-array@0.11.14': resolution: {integrity: sha512-3T8LkOmg45BV5FICb15QQMsyUSWrQ8AygVfC7ZG32zOalnqrilm018ZVCw0eapXux8FtA33q8PSRSstjee3jSg==} @@ -2990,9 +2995,11 @@ packages: '@humanwhocodes/object-schema@1.2.1': resolution: {integrity: sha512-ZnQMnLV4e7hDlUvw8H+U8ASL02SS2Gn6+9Ac3wGGLIe7+je2AeAOxPY+izIPJDfFDb7eDjev0Us8MO1iFRN8hA==} + deprecated: Use @eslint/object-schema 
instead '@humanwhocodes/object-schema@2.0.1': resolution: {integrity: sha512-dvuCeX5fC9dXgJn9t+X5atfmgQAzUOWqS1254Gh0m6i8wKd10ebXkfNKiRK+1GWi/yTvvLDHpoxLr0xxxeslWw==} + deprecated: Use @eslint/object-schema instead '@humanwhocodes/object-schema@2.0.3': resolution: {integrity: sha512-93zYdMES/c1D69yZiKDBj0V24vqNzB/koF26KPaagAfd3P/4gUlh3Dys5ogAK+Exi9QyzlD8x/08Zt7wIKcDcA==} @@ -3078,94 +3085,89 @@ packages: '@jridgewell/trace-mapping@0.3.9': resolution: {integrity: sha512-3Belt6tdc8bPgAtbcmdtNJlirVoTmEb5e2gC94PnkwEW9jI6CAHUeoG85tjWP5WquqfavoMtMwiG4P926ZKKuQ==} - '@libsql/client@0.4.3': - resolution: {integrity: sha512-AUYKnSPqAsFBVWBvmtrb4dG3pQlvTKT92eztAest9wQU2iJkabH8WzHLDb3dKFWKql7/kiCqvBQUVpozDwhekQ==} + '@libsql/client@0.10.0': + resolution: {integrity: sha512-2ERn08T4XOVx34yBtUPq0RDjAdd9TJ5qNH/izugr208ml2F94mk92qC64kXyDVQINodWJvp3kAdq6P4zTtCZ7g==} - '@libsql/client@0.5.6': - resolution: {integrity: sha512-UBjmDoxz75Z2sHdP+ETCROpeLA/77VMesiff8R4UWK1rnaWbh6/YoCLDILMJL3Rh0udQeKxjL8MjXthqohax+g==} + '@libsql/core@0.10.0': + resolution: {integrity: sha512-rqynAXGaiSpTsykOZdBtI1N4z4O+KZ6mt33K/aHeXAY0gSIfK/ctxuWa0Y1Bjo4FMz1idBTCXz4Ps5kITOvZZw==} - '@libsql/core@0.4.3': - resolution: {integrity: sha512-r28iYBtaLBW9RRgXPFh6cGCsVI/rwRlOzSOpAu/1PVTm6EJ3t233pUf97jETVHU0vjdr1d8VvV6fKAvJkokqCw==} - - '@libsql/core@0.5.6': - resolution: {integrity: sha512-3vicUAydq6jPth410n4AsHHm1n2psTwvkSf94nfJlSXutGSZsl0updn2N/mJBgqUHkbuFoWZtlMifF0SwBj1xQ==} - - '@libsql/darwin-arm64@0.2.0': - resolution: {integrity: sha512-+qyT2W/n5CFH1YZWv2mxW4Fsoo4dX9Z9M/nvbQqZ7H84J8hVegvVAsIGYzcK8xAeMEcpU5yGKB1Y9NoDY4hOSQ==} + '@libsql/darwin-arm64@0.3.19': + resolution: {integrity: sha512-rmOqsLcDI65zzxlUOoEiPJLhqmbFsZF6p4UJQ2kMqB+Kc0Rt5/A1OAdOZ/Wo8fQfJWjR1IbkbpEINFioyKf+nQ==} cpu: [arm64] os: [darwin] - '@libsql/darwin-arm64@0.3.18': - resolution: {integrity: sha512-Zt49dt+cwhPCkuoWgvjbQd4ckNfCJR5xzIAyhgHl3CBZqZaEuaXTOGKLNQT7bnFRPuQcdLt5PBT1cenKu2N6pA==} + '@libsql/darwin-arm64@0.4.1': + resolution: 
{integrity: sha512-XICT9/OyU8Aa9Iv1xZIHgvM09n/1OQUk3VC+s5uavzdiGHrDMkOWzN47JN7/FiMa/NWrcgoEiDMk3+e7mE53Ig==} cpu: [arm64] os: [darwin] - '@libsql/darwin-x64@0.2.0': - resolution: {integrity: sha512-hwmO2mF1n8oDHKFrUju6Jv+n9iFtTf5JUK+xlnIE3Td0ZwGC/O1R/Z/btZTd9nD+vsvakC8SJT7/Q6YlWIkhEw==} + '@libsql/darwin-x64@0.3.19': + resolution: {integrity: sha512-q9O55B646zU+644SMmOQL3FIfpmEvdWpRpzubwFc2trsa+zoBlSkHuzU9v/C+UNoPHQVRMP7KQctJ455I/h/xw==} cpu: [x64] os: [darwin] - '@libsql/darwin-x64@0.3.18': - resolution: {integrity: sha512-faq6HUGDaNaueeqPei5cypHaD/hhazUyfHo094CXiEeRZq6ZKtNl5PHdlr8jE/Uw8USNpVVQaLdnvSgKcpRPHw==} + '@libsql/darwin-x64@0.4.1': + resolution: {integrity: sha512-pSKxhRrhu4SsTD+IBRZXcs1SkwMdeAG1tv6Z/Ctp/sOEYrgkU8MDKLqkOr9NsmwpK4S0+JdwjkLMyhTkct/5TQ==} cpu: [x64] os: [darwin] - '@libsql/hrana-client@0.5.6': - resolution: {integrity: sha512-mjQoAmejZ1atG+M3YR2ZW+rg6ceBByH/S/h17ZoYZkqbWrvohFhXyz2LFxj++ARMoY9m6w3RJJIRdJdmnEUlFg==} + '@libsql/hrana-client@0.6.2': + resolution: {integrity: sha512-MWxgD7mXLNf9FXXiM0bc90wCjZSpErWKr5mGza7ERy2FJNNMXd7JIOv+DepBA1FQTIfI8TFO4/QDYgaQC0goNw==} - '@libsql/isomorphic-fetch@0.1.12': - resolution: {integrity: sha512-MRo4UcmjAGAa3ac56LoD5OE13m2p0lu0VEtZC2NZMcogM/jc5fU9YtMQ3qbPjFJ+u2BBjFZgMPkQaLS1dlMhpg==} + '@libsql/isomorphic-fetch@0.2.5': + resolution: {integrity: sha512-8s/B2TClEHms2yb+JGpsVRTPBfy1ih/Pq6h6gvyaNcYnMVJvgQRY7wAa8U2nD0dppbCuDU5evTNMEhrQ17ZKKg==} + engines: {node: '>=18.0.0'} '@libsql/isomorphic-ws@0.1.5': resolution: {integrity: sha512-DtLWIH29onUYR00i0GlQ3UdcTRC6EP4u9w/h9LxpUZJWRMARk6dQwZ6Jkd+QdwVpuAOrdxt18v0K2uIYR3fwFg==} - '@libsql/linux-arm64-gnu@0.2.0': - resolution: {integrity: sha512-1w2lPXIYtnBaK5t/Ej5E8x7lPiE+jP3KATI/W4yei5Z/ONJh7jQW5PJ7sYU95vTME3hWEM1FXN6kvzcpFAte7w==} + '@libsql/linux-arm64-gnu@0.3.19': + resolution: {integrity: sha512-mgeAUU1oqqh57k7I3cQyU6Trpdsdt607eFyEmH5QO7dv303ti+LjUvh1pp21QWV6WX7wZyjeJV1/VzEImB+jRg==} cpu: [arm64] os: [linux] - '@libsql/linux-arm64-gnu@0.3.18': - resolution: 
{integrity: sha512-5m9xtDAhoyLSV54tho9uQ2ZIDeJWc0vU3Xpe/VK4+6bpURISs23qNhXiCrZnnq3oV0hFlBfcIgQUIATmb6jD2A==} + '@libsql/linux-arm64-gnu@0.4.1': + resolution: {integrity: sha512-9lpvb24tO2qZd9nq5dlq3ESA3hSKYWBIK7lJjfiCM6f7a70AUwBY9QoPJV9q4gILIyVnR1YBGrlm50nnb+dYgw==} cpu: [arm64] os: [linux] - '@libsql/linux-arm64-musl@0.2.0': - resolution: {integrity: sha512-lkblBEJ7xuNiWNjP8DDq0rqoWccszfkUS7Efh5EjJ+GDWdCBVfh08mPofIZg0fZVLWQCY3j+VZCG1qZfATBizg==} + '@libsql/linux-arm64-musl@0.3.19': + resolution: {integrity: sha512-VEZtxghyK6zwGzU9PHohvNxthruSxBEnRrX7BSL5jQ62tN4n2JNepJ6SdzXp70pdzTfwroOj/eMwiPt94gkVRg==} cpu: [arm64] os: [linux] - '@libsql/linux-arm64-musl@0.3.18': - resolution: {integrity: sha512-oYD5+oM2gPEalp+EoR5DVQBRtdGjLsocjsRbQs5O2m4WOBJKER7VUfDYZHsifLGZoBSc11Yo6s9IR9rjGWy20w==} + '@libsql/linux-arm64-musl@0.4.1': + resolution: {integrity: sha512-lyxi+lFxE+NcBRDMQCxCtDg3c4WcKAbc9u63d5+B23Vm+UgphD9XY4seu+tGrBy1MU2tuNVix7r9S7ECpAaVrA==} cpu: [arm64] os: [linux] - '@libsql/linux-x64-gnu@0.2.0': - resolution: {integrity: sha512-+x/d289KeJydwOhhqSxKT+6MSQTCfLltzOpTzPccsvdt5fxg8CBi+gfvEJ4/XW23Sa+9bc7zodFP0i6MOlxX7w==} + '@libsql/linux-x64-gnu@0.3.19': + resolution: {integrity: sha512-2t/J7LD5w2f63wGihEO+0GxfTyYIyLGEvTFEsMO16XI5o7IS9vcSHrxsvAJs4w2Pf907uDjmc7fUfMg6L82BrQ==} cpu: [x64] os: [linux] - '@libsql/linux-x64-gnu@0.3.18': - resolution: {integrity: sha512-QDSSP60nS8KIldGE7H3bpEflQHiL1erwED6huoVJdmDFxsyDJX2CYdWUWW8Za0ZUOvUbnEWAOyMhp6j1dBbZqw==} + '@libsql/linux-x64-gnu@0.4.1': + resolution: {integrity: sha512-psvuQ3UFBEmDFV8ZHG+WkUHIJiWv+elZ+zIPvOVedlIKdxG1O+8WthWUAhFHOGnbiyzc4sAZ4c3de1oCvyHxyQ==} cpu: [x64] os: [linux] - '@libsql/linux-x64-musl@0.2.0': - resolution: {integrity: sha512-5Xn0c5A6vKf9D1ASpgk7mef//FuY7t5Lktj/eiU4n3ryxG+6WTpqstTittJUgepVjcleLPYxIhQAYeYwTYH1IQ==} + '@libsql/linux-x64-musl@0.3.19': + resolution: {integrity: sha512-BLsXyJaL8gZD8+3W2LU08lDEd9MIgGds0yPy5iNPp8tfhXx3pV/Fge2GErN0FC+nzt4DYQtjL+A9GUMglQefXQ==} cpu: [x64] os: [linux] - 
'@libsql/linux-x64-musl@0.3.18': - resolution: {integrity: sha512-5SXwTlaLCUPzxYyq+P0c7Ko7tcEjpd1X6RZKe1DuRFmJPg6f7j2+LrPEhMSIbqKcrl5ACUUAyoKmGZqNYwz23w==} + '@libsql/linux-x64-musl@0.4.1': + resolution: {integrity: sha512-PDidJ3AhGDqosGg3OAZzGxMFIbnuOALya4BoezJKl667AFv3x7BBQ30H81Mngsq3Fh8RkJkXSdWfL91+Txb1iA==} cpu: [x64] os: [linux] - '@libsql/win32-x64-msvc@0.2.0': - resolution: {integrity: sha512-rpK+trBIpRST15m3cMYg5aPaX7kvCIottxY7jZPINkKAaScvfbn9yulU/iZUM9YtuK96Y1ZmvwyVIK/Y5DzoMQ==} + '@libsql/win32-x64-msvc@0.3.19': + resolution: {integrity: sha512-ay1X9AobE4BpzG0XPw1gplyLZPGHIgJOovvW23gUrukRegiUP62uzhpRbKNogLlUOynyXeq//prHgPXiebUfWg==} cpu: [x64] os: [win32] - '@libsql/win32-x64-msvc@0.3.18': - resolution: {integrity: sha512-9EEIHz+e8tTbx9TMkb8ByZnzxc0pYFirK1nSbqC6cFEST95fiY0NCfQ/zAzJxe90KckbjifX6BbO69eWIi3TAg==} + '@libsql/win32-x64-msvc@0.4.1': + resolution: {integrity: sha512-IdODVqV/PrdOnHA/004uWyorZQuRsB7U7bCRCE3vXgABj3eJLJGc6cv2C6ksEaEoVxJbD8k53H4VVAGrtYwXzQ==} cpu: [x64] os: [win32] @@ -4056,9 +4058,6 @@ packages: '@types/minimist@1.2.2': resolution: {integrity: sha512-jhuKLIRrhvCPLqwPcx6INqmKeiA5EWrsCOPhrlFSrbrmU4ZMPjj5Ul/oLCMDO98XRUIwVm78xICz4EPCektzeQ==} - '@types/node-fetch@2.6.11': - resolution: {integrity: sha512-24xFj9R5+rfQJLRyM56qh+wnVSYhyXC2tkoBndtY0U+vubqNsYXGjufB2nn8Q6gt0LrARwL6UBtMCSVCwl4B1g==} - '@types/node-forge@1.3.11': resolution: {integrity: sha512-FQx220y22OKNTqaByeBGqHWYz4cl94tpcxeFdvBo3wjG6XPBuZ0BNgNZRV5J5TFmmcsJ4IzsLkmGRiQbnYsBEQ==} @@ -4509,6 +4508,7 @@ packages: are-we-there-yet@3.0.1: resolution: {integrity: sha512-QZW4EDmGwlYur0Yyf/b2uGucHQMa8aFUP7eu9ddR73vvhFyt4V0Vl3QHPcTNJ8l6qYOBdxgXdnBXQrHilfRQBg==} engines: {node: ^12.13.0 || ^14.15.0 || >=16.0.0} + deprecated: This package is no longer supported. 
arg@4.1.3: resolution: {integrity: sha512-58S9QDqG0Xx27YwPSt9fJxivjYl432YCwfDMfZ+71RAqUrZef7LrKQZ3LHLOwCS4FLNBplP533Zx895SeOCHvA==} @@ -4630,10 +4630,6 @@ packages: resolution: {integrity: sha512-wvUjBtSGN7+7SjNpq/9M2Tg350UZD3q62IFZLbRAR1bSMlCo1ZaeW+BJ+D090e4hIIZLBcTDWe4Mh4jvUDajzQ==} engines: {node: '>= 0.4'} - aws-ssl-profiles@1.1.1: - resolution: {integrity: sha512-+H+kuK34PfMaI9PNU/NSjBKL5hh/KDM9J72kwYeYEm0A8B1AC4fuCy3qsjnA7lxklgyXsB68yn8Z2xoZEjgwCQ==} - engines: {node: '>= 6.0.0'} - axios@1.6.8: resolution: {integrity: sha512-v/ZHtJDU39mDpyBoFVkETcd/uNdxrWRrg3bKpOKzXFA6Bvqopts6ALSMU3y6ijYxbw2B+wPrIv46egTzJXCLGQ==} @@ -6275,6 +6271,7 @@ packages: gauge@4.0.4: resolution: {integrity: sha512-f9m+BEN5jkg6a0fZjleidjN51VE1X+mPFQ2DJ0uv1V39oCLCbsGe6yjbBnp7eK7z/+GAon99a3nHuqbuuthyPg==} engines: {node: ^12.13.0 || ^14.15.0 || >=16.0.0} + deprecated: This package is no longer supported. generate-function@2.3.1: resolution: {integrity: sha512-eeB5GfMNeevm/GRYq20ShmsaGcmI81kIX2K9XQx5miC8KdHaC6Jm0qQ8ZNeGOi7wYB8OsdxKs+Y2oVuTFuVwKQ==} @@ -7092,13 +7089,12 @@ packages: resolution: {integrity: sha512-+bT2uH4E5LGE7h/n3evcS/sQlJXCpIp6ym8OWJ5eV6+67Dsql/LaaT7qJBAt2rzfoa/5QBGBhxDix1dMt2kQKQ==} engines: {node: '>= 0.8.0'} - libsql@0.2.0: - resolution: {integrity: sha512-ELBRqhpJx5Dap0187zKQnntZyk4EjlDHSrjIVL8t+fQ5e8IxbQTeYgZgigMjB1EvrETdkm0Y0VxBGhzPQ+t0Jg==} - cpu: [x64, arm64] + libsql@0.3.19: + resolution: {integrity: sha512-Aj5cQ5uk/6fHdmeW0TiXK42FqUlwx7ytmMLPSaUQPin5HKKKuUPD62MAbN4OEweGBBI7q1BekoEN4gPUEL6MZA==} os: [darwin, linux, win32] - libsql@0.3.18: - resolution: {integrity: sha512-lvhKr7WV3NLWRbXkjn/MeKqXOAqWKU0PX9QYrvDh7fneukapj+iUQ4qgJASrQyxcCrEsClXCQiiK5W6OoYPAlA==} + libsql@0.4.1: + resolution: {integrity: sha512-qZlR9Yu1zMBeLChzkE/cKfoKV3Esp9cn9Vx5Zirn4AVhDWPcjYhKwbtJcMuHehgk3mH+fJr9qW+3vesBWbQpBg==} os: [darwin, linux, win32] lighthouse-logger@1.4.2: @@ -7309,10 +7305,6 @@ packages: resolution: {integrity: 
sha512-MhWWlVnuab1RG5/zMRRcVGXZLCXrZTgfwMikgzCegsPnG62yDQo5JnqKkrK4jO5iKqDAZGItAqN5CtKBCBWRUA==} engines: {node: '>=16.14'} - lru-cache@9.1.2: - resolution: {integrity: sha512-ERJq3FOzJTxBbFjZ7iDs+NiK4VI9Wz+RdrrAB8dio1oV+YvdPzUEE4QNiT2VD51DkIbCYRUUzCRkssXCHqSnKQ==} - engines: {node: 14 || >=16.14} - lru-queue@0.1.0: resolution: {integrity: sha512-BpdYkt9EvGl8OfWHDQPISVpcl5xZthb+XPsbELj5AQXxIC8IriDZIQYjBJPEm5rS420sjZ0TLEzRcq5KdBhYrQ==} @@ -7626,10 +7618,6 @@ packages: resolution: {integrity: sha512-at/ZndSy3xEGJ8i0ygALh8ru9qy7gWW1cmkaqBN29JmMlIvM//MEO9y1sk/avxuwnPcfhkejkLsuPxH81BrkSg==} engines: {node: '>=0.8.0'} - mysql2@3.11.0: - resolution: {integrity: sha512-J9phbsXGvTOcRVPR95YedzVSxJecpW5A5+cQ57rhHIFXteTP10HCs+VBjS7DHIKfEaI1zQ5tlVrquCd64A6YvA==} - engines: {node: '>= 8.0'} - mysql2@3.3.3: resolution: {integrity: sha512-MxDQJztArk4JFX1PKVjDhIXRzAmVJfuqZrVU+my6NeYBAA/XZRaDw5q7vga8TNvgyy3Lv3rivBFBBuJFbsdjaw==} engines: {node: '>= 8.0'} @@ -7785,6 +7773,7 @@ packages: npmlog@6.0.2: resolution: {integrity: sha512-/vBvz5Jfr9dT/aFWd0FIRf+T/Q2WBsLENygUaFUqstqsycmZAP/t5BvFJTK0viFmSUxiUKTUplWy5vt+rvKIxg==} engines: {node: ^12.13.0 || ^14.15.0 || >=16.0.0} + deprecated: This package is no longer supported. 
npx-import@1.1.4: resolution: {integrity: sha512-3ShymTWOgqGyNlh5lMJAejLuIv3W1K3fbI5Ewc6YErZU3Sp0PqsNs8UIU1O8z5+KVl/Du5ag56Gza9vdorGEoA==} @@ -8302,6 +8291,9 @@ packages: bluebird: optional: true + promise-limit@2.7.0: + resolution: {integrity: sha512-7nJ6v5lnJsXwGprnGXga4wx6d1POjvi5Qmf1ivTRxTjH4Z/9Czja/UCMLVmB9N93GeWOU93XaFaEt6jbuoagNw==} + promise-retry@2.0.1: resolution: {integrity: sha512-y+WKFlBR8BGXnsNlIHFGPZmyDf3DFMoLhaflAnyZgV6rG6xu+JwesTo2Q9R6XwYmtmwAFCkAk3e35jEdoeh/3g==} engines: {node: '>=10'} @@ -8598,6 +8590,7 @@ packages: rimraf@3.0.2: resolution: {integrity: sha512-JZkJMZkAGFFPP2YqXZXPbMlMBgsxzE8ILs4lMIX/2o0L9UBw9O/Y3o6wFw/i9YLapcUJWwqbi3kdxIPdC62TIA==} + deprecated: Rimraf versions prior to v4 are no longer supported hasBin: true rimraf@5.0.0: @@ -10210,7 +10203,7 @@ snapshots: '@aws-sdk/client-sso-oidc': 3.583.0(@aws-sdk/client-sts@3.583.0) '@aws-sdk/client-sts': 3.583.0 '@aws-sdk/core': 3.582.0 - '@aws-sdk/credential-provider-node': 3.583.0(@aws-sdk/client-sso-oidc@3.583.0(@aws-sdk/client-sts@3.583.0))(@aws-sdk/client-sts@3.583.0) + '@aws-sdk/credential-provider-node': 3.583.0(@aws-sdk/client-sso-oidc@3.583.0)(@aws-sdk/client-sts@3.583.0) '@aws-sdk/middleware-host-header': 3.577.0 '@aws-sdk/middleware-logger': 3.577.0 '@aws-sdk/middleware-recursion-detection': 3.577.0 @@ -10300,7 +10293,7 @@ snapshots: '@aws-crypto/sha256-js': 3.0.0 '@aws-sdk/client-sts': 3.583.0 '@aws-sdk/core': 3.582.0 - '@aws-sdk/credential-provider-node': 3.583.0(@aws-sdk/client-sso-oidc@3.583.0(@aws-sdk/client-sts@3.583.0))(@aws-sdk/client-sts@3.583.0) + '@aws-sdk/credential-provider-node': 3.583.0(@aws-sdk/client-sso-oidc@3.583.0)(@aws-sdk/client-sts@3.583.0) '@aws-sdk/middleware-host-header': 3.577.0 '@aws-sdk/middleware-logger': 3.577.0 '@aws-sdk/middleware-recursion-detection': 3.577.0 @@ -10610,7 +10603,7 @@ snapshots: '@aws-crypto/sha256-js': 3.0.0 '@aws-sdk/client-sso-oidc': 3.583.0(@aws-sdk/client-sts@3.583.0) '@aws-sdk/core': 3.582.0 - 
'@aws-sdk/credential-provider-node': 3.583.0(@aws-sdk/client-sso-oidc@3.583.0(@aws-sdk/client-sts@3.583.0))(@aws-sdk/client-sts@3.583.0) + '@aws-sdk/credential-provider-node': 3.583.0(@aws-sdk/client-sso-oidc@3.583.0)(@aws-sdk/client-sts@3.583.0) '@aws-sdk/middleware-host-header': 3.577.0 '@aws-sdk/middleware-logger': 3.577.0 '@aws-sdk/middleware-recursion-detection': 3.577.0 @@ -10799,12 +10792,12 @@ snapshots: - '@aws-sdk/client-sso-oidc' - aws-crt - '@aws-sdk/credential-provider-ini@3.583.0(@aws-sdk/client-sso-oidc@3.583.0(@aws-sdk/client-sts@3.583.0))(@aws-sdk/client-sts@3.583.0)': + '@aws-sdk/credential-provider-ini@3.583.0(@aws-sdk/client-sso-oidc@3.583.0)(@aws-sdk/client-sts@3.583.0)': dependencies: '@aws-sdk/client-sts': 3.583.0 '@aws-sdk/credential-provider-env': 3.577.0 '@aws-sdk/credential-provider-process': 3.577.0 - '@aws-sdk/credential-provider-sso': 3.583.0(@aws-sdk/client-sso-oidc@3.583.0(@aws-sdk/client-sts@3.583.0)) + '@aws-sdk/credential-provider-sso': 3.583.0(@aws-sdk/client-sso-oidc@3.583.0) '@aws-sdk/credential-provider-web-identity': 3.577.0(@aws-sdk/client-sts@3.583.0) '@aws-sdk/types': 3.577.0 '@smithy/credential-provider-imds': 3.0.0 @@ -10889,13 +10882,13 @@ snapshots: - '@aws-sdk/client-sts' - aws-crt - '@aws-sdk/credential-provider-node@3.583.0(@aws-sdk/client-sso-oidc@3.583.0(@aws-sdk/client-sts@3.583.0))(@aws-sdk/client-sts@3.583.0)': + '@aws-sdk/credential-provider-node@3.583.0(@aws-sdk/client-sso-oidc@3.583.0)(@aws-sdk/client-sts@3.583.0)': dependencies: '@aws-sdk/credential-provider-env': 3.577.0 '@aws-sdk/credential-provider-http': 3.582.0 - '@aws-sdk/credential-provider-ini': 3.583.0(@aws-sdk/client-sso-oidc@3.583.0(@aws-sdk/client-sts@3.583.0))(@aws-sdk/client-sts@3.583.0) + '@aws-sdk/credential-provider-ini': 3.583.0(@aws-sdk/client-sso-oidc@3.583.0)(@aws-sdk/client-sts@3.583.0) '@aws-sdk/credential-provider-process': 3.577.0 - '@aws-sdk/credential-provider-sso': 
3.583.0(@aws-sdk/client-sso-oidc@3.583.0(@aws-sdk/client-sts@3.583.0)) + '@aws-sdk/credential-provider-sso': 3.583.0(@aws-sdk/client-sso-oidc@3.583.0) '@aws-sdk/credential-provider-web-identity': 3.577.0(@aws-sdk/client-sts@3.583.0) '@aws-sdk/types': 3.577.0 '@smithy/credential-provider-imds': 3.0.0 @@ -10970,10 +10963,10 @@ snapshots: - '@aws-sdk/client-sso-oidc' - aws-crt - '@aws-sdk/credential-provider-sso@3.583.0(@aws-sdk/client-sso-oidc@3.583.0(@aws-sdk/client-sts@3.583.0))': + '@aws-sdk/credential-provider-sso@3.583.0(@aws-sdk/client-sso-oidc@3.583.0)': dependencies: '@aws-sdk/client-sso': 3.583.0 - '@aws-sdk/token-providers': 3.577.0(@aws-sdk/client-sso-oidc@3.583.0(@aws-sdk/client-sts@3.583.0)) + '@aws-sdk/token-providers': 3.577.0(@aws-sdk/client-sso-oidc@3.583.0) '@aws-sdk/types': 3.577.0 '@smithy/property-provider': 3.0.0 '@smithy/shared-ini-file-loader': 3.0.0 @@ -11216,7 +11209,7 @@ snapshots: '@smithy/types': 2.12.0 tslib: 2.6.2 - '@aws-sdk/token-providers@3.577.0(@aws-sdk/client-sso-oidc@3.583.0(@aws-sdk/client-sts@3.583.0))': + '@aws-sdk/token-providers@3.577.0(@aws-sdk/client-sso-oidc@3.583.0)': dependencies: '@aws-sdk/client-sso-oidc': 3.583.0(@aws-sdk/client-sts@3.583.0) '@aws-sdk/types': 3.577.0 @@ -13328,103 +13321,81 @@ snapshots: '@jridgewell/resolve-uri': 3.1.2 '@jridgewell/sourcemap-codec': 1.4.15 - '@libsql/client@0.4.3(bufferutil@4.0.8)(encoding@0.1.13)(utf-8-validate@6.0.3)': - dependencies: - '@libsql/core': 0.4.3 - '@libsql/hrana-client': 0.5.6(bufferutil@4.0.8)(encoding@0.1.13)(utf-8-validate@6.0.3) - js-base64: 3.7.7 - optionalDependencies: - libsql: 0.2.0 - transitivePeerDependencies: - - bufferutil - - encoding - - utf-8-validate - - '@libsql/client@0.5.6(bufferutil@4.0.8)(encoding@0.1.13)(utf-8-validate@6.0.3)': + '@libsql/client@0.10.0(bufferutil@4.0.8)(utf-8-validate@6.0.3)': dependencies: - '@libsql/core': 0.5.6 - '@libsql/hrana-client': 0.5.6(bufferutil@4.0.8)(encoding@0.1.13)(utf-8-validate@6.0.3) + '@libsql/core': 0.10.0 + 
'@libsql/hrana-client': 0.6.2(bufferutil@4.0.8)(utf-8-validate@6.0.3) js-base64: 3.7.7 - libsql: 0.3.18 + libsql: 0.4.1 + promise-limit: 2.7.0 transitivePeerDependencies: - bufferutil - - encoding - utf-8-validate - '@libsql/core@0.4.3': + '@libsql/core@0.10.0': dependencies: js-base64: 3.7.7 - '@libsql/core@0.5.6': - dependencies: - js-base64: 3.7.7 - - '@libsql/darwin-arm64@0.2.0': + '@libsql/darwin-arm64@0.3.19': optional: true - '@libsql/darwin-arm64@0.3.18': + '@libsql/darwin-arm64@0.4.1': optional: true - '@libsql/darwin-x64@0.2.0': + '@libsql/darwin-x64@0.3.19': optional: true - '@libsql/darwin-x64@0.3.18': + '@libsql/darwin-x64@0.4.1': optional: true - '@libsql/hrana-client@0.5.6(bufferutil@4.0.8)(encoding@0.1.13)(utf-8-validate@6.0.3)': + '@libsql/hrana-client@0.6.2(bufferutil@4.0.8)(utf-8-validate@6.0.3)': dependencies: - '@libsql/isomorphic-fetch': 0.1.12(encoding@0.1.13) + '@libsql/isomorphic-fetch': 0.2.5 '@libsql/isomorphic-ws': 0.1.5(bufferutil@4.0.8)(utf-8-validate@6.0.3) js-base64: 3.7.7 node-fetch: 3.3.2 transitivePeerDependencies: - bufferutil - - encoding - utf-8-validate - '@libsql/isomorphic-fetch@0.1.12(encoding@0.1.13)': - dependencies: - '@types/node-fetch': 2.6.11 - node-fetch: 2.7.0(encoding@0.1.13) - transitivePeerDependencies: - - encoding + '@libsql/isomorphic-fetch@0.2.5': {} '@libsql/isomorphic-ws@0.1.5(bufferutil@4.0.8)(utf-8-validate@6.0.3)': dependencies: '@types/ws': 8.5.11 - ws: 8.17.0(bufferutil@4.0.8)(utf-8-validate@6.0.3) + ws: 8.18.0(bufferutil@4.0.8)(utf-8-validate@6.0.3) transitivePeerDependencies: - bufferutil - utf-8-validate - '@libsql/linux-arm64-gnu@0.2.0': + '@libsql/linux-arm64-gnu@0.3.19': optional: true - '@libsql/linux-arm64-gnu@0.3.18': + '@libsql/linux-arm64-gnu@0.4.1': optional: true - '@libsql/linux-arm64-musl@0.2.0': + '@libsql/linux-arm64-musl@0.3.19': optional: true - '@libsql/linux-arm64-musl@0.3.18': + '@libsql/linux-arm64-musl@0.4.1': optional: true - '@libsql/linux-x64-gnu@0.2.0': + 
'@libsql/linux-x64-gnu@0.3.19': optional: true - '@libsql/linux-x64-gnu@0.3.18': + '@libsql/linux-x64-gnu@0.4.1': optional: true - '@libsql/linux-x64-musl@0.2.0': + '@libsql/linux-x64-musl@0.3.19': optional: true - '@libsql/linux-x64-musl@0.3.18': + '@libsql/linux-x64-musl@0.4.1': optional: true - '@libsql/win32-x64-msvc@0.2.0': + '@libsql/win32-x64-msvc@0.3.19': optional: true - '@libsql/win32-x64-msvc@0.3.18': + '@libsql/win32-x64-msvc@0.4.1': optional: true '@miniflare/core@2.14.2': @@ -14684,11 +14655,6 @@ snapshots: '@types/minimist@1.2.2': {} - '@types/node-fetch@2.6.11': - dependencies: - '@types/node': 20.12.12 - form-data: 4.0.0 - '@types/node-forge@1.3.11': dependencies: '@types/node': 20.12.12 @@ -15124,7 +15090,7 @@ snapshots: pathe: 1.1.2 picocolors: 1.0.1 sirv: 2.0.4 - vitest: 1.6.0(@types/node@18.15.10)(@vitest/ui@1.6.0)(lightningcss@1.25.1)(terser@5.31.0) + vitest: 1.6.0(@types/node@20.12.12)(@vitest/ui@1.6.0)(lightningcss@1.25.1)(terser@5.31.0) '@vitest/utils@1.6.0': dependencies: @@ -15425,9 +15391,6 @@ snapshots: dependencies: possible-typed-array-names: 1.0.0 - aws-ssl-profiles@1.1.1: - optional: true - axios@1.6.8: dependencies: follow-redirects: 1.15.6 @@ -16313,11 +16276,11 @@ snapshots: transitivePeerDependencies: - supports-color - drizzle-orm@0.27.2(@aws-sdk/client-rds-data@3.583.0)(@cloudflare/workers-types@4.20240524.0)(@libsql/client@0.5.6)(@neondatabase/serverless@0.9.3)(@opentelemetry/api@1.8.0)(@planetscale/database@1.18.0)(@types/better-sqlite3@7.6.10)(@types/pg@8.11.6)(@types/sql.js@1.4.9)(@vercel/postgres@0.8.0)(better-sqlite3@9.6.0)(bun-types@1.0.3)(knex@2.5.1(better-sqlite3@9.6.0)(mysql2@3.11.0)(pg@8.11.5)(sqlite3@5.1.7))(kysely@0.25.0)(mysql2@3.11.0)(pg@8.11.5)(postgres@3.4.4)(sql.js@1.10.3)(sqlite3@5.1.7): + 
drizzle-orm@0.27.2(@aws-sdk/client-rds-data@3.583.0)(@cloudflare/workers-types@4.20240524.0)(@libsql/client@0.10.0)(@neondatabase/serverless@0.9.3)(@opentelemetry/api@1.8.0)(@planetscale/database@1.18.0)(@types/better-sqlite3@7.6.10)(@types/pg@8.11.6)(@types/sql.js@1.4.9)(@vercel/postgres@0.8.0)(better-sqlite3@9.6.0)(bun-types@1.0.3)(knex@2.5.1(better-sqlite3@9.6.0)(mysql2@3.3.3)(pg@8.11.5)(sqlite3@5.1.7))(kysely@0.25.0)(mysql2@3.3.3)(pg@8.11.5)(postgres@3.4.4)(sql.js@1.10.3)(sqlite3@5.1.7): optionalDependencies: '@aws-sdk/client-rds-data': 3.583.0 '@cloudflare/workers-types': 4.20240524.0 - '@libsql/client': 0.5.6(bufferutil@4.0.8)(encoding@0.1.13)(utf-8-validate@6.0.3) + '@libsql/client': 0.10.0(bufferutil@4.0.8)(utf-8-validate@6.0.3) '@neondatabase/serverless': 0.9.3 '@opentelemetry/api': 1.8.0 '@planetscale/database': 1.18.0 @@ -16327,9 +16290,9 @@ snapshots: '@vercel/postgres': 0.8.0 better-sqlite3: 9.6.0 bun-types: 1.0.3 - knex: 2.5.1(better-sqlite3@9.6.0)(mysql2@3.11.0)(pg@8.11.5)(sqlite3@5.1.7) + knex: 2.5.1(better-sqlite3@9.6.0)(mysql2@3.3.3)(pg@8.11.5)(sqlite3@5.1.7) kysely: 0.25.0 - mysql2: 3.11.0 + mysql2: 3.3.3 pg: 8.11.5 postgres: 3.4.4 sql.js: 1.10.3 @@ -18320,7 +18283,7 @@ snapshots: transitivePeerDependencies: - supports-color - knex@2.5.1(better-sqlite3@9.6.0)(mysql2@3.11.0)(pg@8.11.5)(sqlite3@5.1.7): + knex@2.5.1(better-sqlite3@9.6.0)(mysql2@3.3.3)(pg@8.11.5)(sqlite3@5.1.7): dependencies: colorette: 2.0.19 commander: 10.0.1 @@ -18338,7 +18301,7 @@ snapshots: tildify: 2.0.0 optionalDependencies: better-sqlite3: 9.6.0 - mysql2: 3.11.0 + mysql2: 3.3.3 pg: 8.11.5 sqlite3: 5.1.7 transitivePeerDependencies: @@ -18354,32 +18317,32 @@ snapshots: prelude-ls: 1.2.1 type-check: 0.4.0 - libsql@0.2.0: + libsql@0.3.19: dependencies: '@neon-rs/load': 0.0.4 detect-libc: 2.0.2 optionalDependencies: - '@libsql/darwin-arm64': 0.2.0 - '@libsql/darwin-x64': 0.2.0 - '@libsql/linux-arm64-gnu': 0.2.0 - '@libsql/linux-arm64-musl': 0.2.0 - '@libsql/linux-x64-gnu': 0.2.0 - 
'@libsql/linux-x64-musl': 0.2.0 - '@libsql/win32-x64-msvc': 0.2.0 - optional: true + '@libsql/darwin-arm64': 0.3.19 + '@libsql/darwin-x64': 0.3.19 + '@libsql/linux-arm64-gnu': 0.3.19 + '@libsql/linux-arm64-musl': 0.3.19 + '@libsql/linux-x64-gnu': 0.3.19 + '@libsql/linux-x64-musl': 0.3.19 + '@libsql/win32-x64-msvc': 0.3.19 - libsql@0.3.18: + libsql@0.4.1: dependencies: '@neon-rs/load': 0.0.4 detect-libc: 2.0.2 + libsql: 0.3.19 optionalDependencies: - '@libsql/darwin-arm64': 0.3.18 - '@libsql/darwin-x64': 0.3.18 - '@libsql/linux-arm64-gnu': 0.3.18 - '@libsql/linux-arm64-musl': 0.3.18 - '@libsql/linux-x64-gnu': 0.3.18 - '@libsql/linux-x64-musl': 0.3.18 - '@libsql/win32-x64-msvc': 0.3.18 + '@libsql/darwin-arm64': 0.4.1 + '@libsql/darwin-x64': 0.4.1 + '@libsql/linux-arm64-gnu': 0.4.1 + '@libsql/linux-arm64-musl': 0.4.1 + '@libsql/linux-x64-gnu': 0.4.1 + '@libsql/linux-x64-musl': 0.4.1 + '@libsql/win32-x64-msvc': 0.4.1 lighthouse-logger@1.4.2: dependencies: @@ -18548,8 +18511,6 @@ snapshots: lru-cache@8.0.5: {} - lru-cache@9.1.2: {} - lru-queue@0.1.0: dependencies: es5-ext: 0.10.62 @@ -18988,19 +18949,6 @@ snapshots: rimraf: 2.4.5 optional: true - mysql2@3.11.0: - dependencies: - aws-ssl-profiles: 1.1.1 - denque: 2.1.0 - generate-function: 2.3.1 - iconv-lite: 0.6.3 - long: 5.2.3 - lru-cache: 8.0.5 - named-placeholders: 1.1.3 - seq-queue: 0.0.5 - sqlstring: 2.3.3 - optional: true - mysql2@3.3.3: dependencies: denque: 2.1.0 @@ -19413,7 +19361,7 @@ snapshots: path-scurry@1.10.1: dependencies: - lru-cache: 9.1.2 + lru-cache: 10.2.2 minipass: 5.0.0 path-scurry@1.11.1: @@ -19638,6 +19586,8 @@ snapshots: promise-inflight@1.0.1: optional: true + promise-limit@2.7.0: {} + promise-retry@2.0.1: dependencies: err-code: 2.0.3