diff --git a/blog/creating-synapse.md b/blog/creating-synapse.md new file mode 100644 index 0000000..8ec131b --- /dev/null +++ b/blog/creating-synapse.md @@ -0,0 +1,260 @@ +# Creating Synapse: A Compiler for the Cloud + +My name is Jaden. I used to work on developer tools at AWS before leaving to start my own company, Cohesible. For nearly a year I've been tackling the problem of making cloud-based applications easier to create and use. Because let's face it, developing for the cloud is rough. + +## The Beginning + +Where do you even start with such a complex problem? I had no clue. But I did know what kind of experience I wanted: + +> Let me easily create and use cloud resources! + +For me, using the cloud just felt so... tedious. The number of hoops you need to jump through just to put something into an S3 bucket from a Lambda function is absurd: +* Create the S3 bucket +* Write some code, ignoring portability by hardcoding the bucket name +* Figure out how Lambda expects your code to be packaged +* Package the code +* Create the Lambda function +* Navigate the AWS console (or CLI) trying to figure out how to run the code +* Figure it out, but the code fails??? Oh, it's because you need to configure _permissions_ +* Fix permissions, now it works! +* Avoid having to change the code because you don't want to deal with that again + +If you're using the AWS console, you get the additional pleasure of waiting on page reloads _constantly_. Everything in AWS is so scattered around that navigation alone is painful. And that's one of the simpler examples of using the cloud! Things get even more fun once you bring networks, observability, and cost into the equation. + +Don't get me wrong, cloud services like S3 and Lambda are _amazing_ feats of engineering. In isolation. Trying to use them together? A nightmare. + +## What Came Before + +I wasn't alone in my frustration. Countless tools have been created to address the problem of creating cloud resources.
_Creating cloud resources_. But not so much interacting with them from your app. + +If you're okay with only sparingly using the cloud within your application, then creation is good enough. But that feels so limiting. I want to just _use_ the simplest, easiest, cheapest solution whenever possible. Sometimes that might be something "serverless", other times that might be running a fleet of dedicated hosts. I wanted a solution where making that choice was not a technical limitation. + +Creating such a solution is tricky because "using resources" implies a strong coupling with the application itself. Traditional infrastructure-as-code (IaC) tools that use JSON or YAML just wouldn't cut it. They're good at creating the resources, not so much using them. + +So what about newer IaC tools like CDK and Pulumi? I mean, they use programming languages, right? They do, but mainly to simplify creating more sophisticated infrastructure. Pulumi's "magic functions" feature is close to what I was imagining. JavaScript or Python functions are serialized during deployment, automatically capturing any deployed resources along the way. I was able to put something in a bucket from a Lambda function fairly easily with this: + +```ts +import * as aws from '@pulumi/aws' +import * as S3 from '@aws-sdk/client-s3' + +const bucket = new aws.s3.Bucket('my-bucket') + +aws.lambda.createFunctionFromEventHandler('my-fn', async ev => { + const client = new S3.S3() + await client.putObject({ + Bucket: bucket.bucket.get(), + Key: 'foo', + Body: 'bar', + }) +}) +``` + +But this solution still had problems: +* Overly broad permissions +* No way to bundle code +* Invoking the function is not obvious +* Putting something into the bucket requires more libraries +* Not obvious how to adapt to other cloud compute, e.g. dedicated hosts + +Magic functions are a cool feature. And I wanted a fully-realized version of them. + +Tools in the IaC space tend to "go broad", offering language-agnostic solutions.
This, to me, seems like a safe business strategy. But it's difficult to truly explore the problem space when spread so thin. I think this is why Pulumi's magic functions felt incomplete. For whatever I was going to build, I wanted to specialize. So I opted to go all-in on TypeScript, which is also my favorite language. + +## Hacking Away + +Initial implementations were rough, to say the least. They "worked" in the sense that they let me create and use cloud resources. But only if you wrote the code _just right_. That's not what I wanted. + +### Static analysis + +My first implementations relied on static analysis. Because it felt "cleaner". Static analysis is _hard_ though. You essentially have to re-implement the language depending on how extensive your analysis is. + +Here's one of my earliest code examples: + +```ts +import * as cloud from 'cloud' + +const bucket = new cloud.Bucket() + +export function read(key: string) { + return bucket.get(key) +} + +const fn = new cloud.Function(read) +``` + +My implementation would parse this code, looking for `new` calls from the `cloud` module. Identifiers like `bucket` and `fn` were marked as cloud resources, which was how I could automatically figure out what needed to be deployed. + +This analysis was brittle. What if the bucket was created within a function instead? Broken. What if `read` was created by a factory function? Broken. What if we wanted to _conditionally_ create a resource? Broken. + +But here's the thing: I was already executing the user's code in order to "compile" it. This was because I was using the CDK for Terraform, which executes JavaScript to produce configuration files. There were other ways to make this work. + +### Going dynamic + +Focus shifted from doing most of the work statically to doing most of the work dynamically. Static analysis became a mechanism to prepare the code for execution. An intermediate step.
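To make the contrast concrete, here's a minimal sketch (hypothetical names, nothing from Synapse's actual internals) of why executing code beats pattern-matching it: resource constructors simply register themselves when they run, so factories and conditionals work for free.

```typescript
// A tiny registry that resource constructors report to at runtime.
// Hypothetical sketch -- not Synapse's real implementation.
const registry: Resource[] = []

abstract class Resource {
    constructor(public readonly kind: string) {
        // Registration happens when the constructor *runs*, so it works
        // inside functions, factories, and conditionals -- exactly the
        // cases that broke the static analyzer.
        registry.push(this)
    }
}

class Bucket extends Resource {
    constructor() {
        super('bucket')
    }
}

// A factory function -- invisible to naive static analysis.
function makeBucket(enabled: boolean): Bucket | undefined {
    return enabled ? new Bucket() : undefined
}

makeBucket(true)
makeBucket(false)

// Only the bucket that was actually constructed gets recorded.
console.log(registry.length) // prints 1
```

Once the program has run, the registry holds exactly the set of resources to deploy, no matter how convoluted the code paths were.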
The prep work allowed me to solve problems like permissions and function serialization by executing code. + +This was a game changer. My first few implementations felt dead. Rigid. _Static_. Many things just weren't possible. But now, most things simply work. It feels alive. The moment I wrote some code to instrument serverless functions was when I thought "this is real now": + +```ts +export class LambdaFunction<T extends any[], U> { + public constructor(target: (...args: T) => Promise<U> | U) { + const entrypoint = new Bundle(instrument(target)) + + // ... creating IAM roles, uploading to S3, creating the Lambda function, etc. + } +} + +const logger = getCustomLogger() + +function instrument<T, U extends any[]>(fn: (...args: U) => T): (...args: U) => Promise<T> { + return async (...args) => { + try { + return await fn(...args) + } catch (err) { + logger.error(err) + + throw err + } + } +} +``` + +Imagine for a moment that `logger` was some sort of logging service created using Synapse. That's what I had done. I was able to create my own service and feed it back into the code. Of course, I didn't want to automatically instrument everyone's code by default, and so the logger no longer exists. But it _could_ exist, and I think that's amazing. + +## Building a Foundation + +Abstraction is a tricky topic. Too much and you severely limit possible use-cases. Too little and you're basically doing nothing at all. In either case, you're probably getting in people's way. + +So how do you go about trying to abstract something as complex as cloud computing? In layers, of course! There is no single abstraction that is "just right" for the cloud. But what you can do is create many "just right" abstractions that build off each other. + +### Layer 0 - Defining Resources (CRUD) + +Before doing _anything_ with a resource, we need to know how to create, read, update, and delete it. The Terraform ecosystem has already done much of the work for us here. No need to re-invent the wheel.
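Abstractly, a Layer 0 definition boils down to a CRUD contract. Here's a minimal sketch of what such a contract could look like (hypothetical shape and names, not Synapse's actual interface), with a toy in-memory provider standing in for a real cloud API:

```typescript
// Hypothetical sketch of a Layer 0 CRUD contract -- not Synapse's real API.
interface ResourceProvider<Props, State> {
    create(props: Props): Promise<State>
    read(state: State): Promise<State>
    update(state: State, props: Props): Promise<State>
    delete(state: State): Promise<void>
}

interface BucketProps { name: string }
interface BucketState { name: string; id: string }

// Toy in-memory backend standing in for a real cloud API.
const store = new Map<string, BucketState>()

const bucketProvider: ResourceProvider<BucketProps, BucketState> = {
    async create(props) {
        const state = { name: props.name, id: `bucket-${store.size + 1}` }
        store.set(state.id, state)
        return state
    },
    async read(state) {
        const found = store.get(state.id)
        if (!found) throw new Error(`missing resource: ${state.id}`)
        return found
    },
    async update(state, props) {
        const next = { ...state, name: props.name }
        store.set(next.id, next)
        return next
    },
    async delete(state) {
        store.delete(state.id)
    },
}

// Walk a resource through its full lifecycle.
async function demo(): Promise<string> {
    const created = await bucketProvider.create({ name: 'logs' })
    const updated = await bucketProvider.update(created, { name: 'logs-v2' })
    const name = (await bucketProvider.read(updated)).name
    await bucketProvider.delete(updated)
    return name
}

demo().then(name => console.log(name)) // prints logs-v2
```

Anything that implements those four operations (a Terraform provider, a REST API, even a plain file) can slot into the same deployment engine.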
Of course, Synapse does allow hooking into this layer in a convenient way. There isn't always a Terraform provider for every possible use-case. + +### Layer 1 - Configuration + +Now we get to create/configure resources. This is where most IaC tools live today. Synapse uses a lightweight mechanism to generate bindings directly from Terraform providers. + +```ts +import * as aws from 'synapse-provider:aws' + +const bucket = new aws.S3Bucket({ forceDestroy: false }) +``` + +### Layer 2 - Interaction Models + +Here's where it gets interesting: we get to _use_ what we create! We don't necessarily abstract away cloud-provider-specific peculiarities here. The goal is to plumb through API calls as well as model permissions/networking. This layer is still very unpolished and often hides too much for the sake of simplicity: + +```ts +import * as aws from 'synapse-provider:aws' +import * as S3 from '@aws-sdk/client-s3' + +type Encoding = 'utf-8' | 'base64' | 'base64url' | 'hex' + +export class Bucket { + private readonly client = new S3.S3({}) + public readonly resource: aws.S3Bucket + public readonly name: string + public readonly id: string + + public constructor() { + this.resource = new aws.S3Bucket({ forceDestroy: false }) + this.id = this.resource.arn + this.name = this.resource.bucket + } + + public async get(key: string): Promise<Uint8Array> + public async get(key: string, encoding: Encoding): Promise<string> + public async get(key: string, encoding?: Encoding): Promise<Uint8Array | string> { + const resp = await this.client.getObject({ Bucket: this.name, Key: key }) + const bytes = await resp.Body!.transformToByteArray() + + return !encoding ? bytes : Buffer.from(bytes).toString(encoding) + } + + public async put(key: string, blob: string | Uint8Array): Promise<void> { + await this.client.putObject({ Bucket: this.name, Key: key, Body: blob }) + } +} +``` + +### Layer 3 - Standard Resources + +Now we distill the core functionality offered by cloud providers into a standard set of interfaces.
An implementation from layer 2 is selected for a given interface based on whichever cloud target is needed. + +```ts +// Simplified class from `synapse:srl/storage` + +export declare class Bucket { + get(key: string): Promise<Uint8Array> + get(key: string, encoding: Encoding): Promise<string> + put(key: string, blob: string | Uint8Array): Promise<void> +} +``` + + +### Extensible Layers + +A major goal with Synapse is to allow for rich customization without sacrificing the power that abstractions offer. The layers are designed in such a way that they all effectively exist as user code. In other words, they are _not_ baked into the compiler. Even the very first layer can be extended directly: + +```ts +import { defineResource } from 'synapse:core' +import { Bucket } from 'synapse:srl/storage' + +const bucket = new Bucket() + +class BucketObject extends defineResource({ + create: async (key: string, value: any) => { + await bucket.put(key, value) + + return { key } + }, + delete: async (state) => { + await bucket.delete(state.key) + }, +}) {} + +const obj = new BucketObject('foo', 'bar') + +export async function main() { + console.log(await bucket.get(obj.key, 'utf-8')) +} +``` + +One thing to point out here is how _fluid_ the layers are. We're able to take an abstract `Bucket` and define an entirely new resource using it. `BucketObject` works the same whether `bucket` uses AWS or Azure! + +## Hello, world! + +Finally, we get to the culmination of all my work: a "Hello, world!" program.
I like using this example because it does what I originally set out to achieve so succinctly: +* Creates cloud resources +* Allows you to quickly use what you just created +* Automatically sets the minimum required permissions + +```ts +import { Bucket } from 'synapse:srl/storage' +import { Function } from 'synapse:srl/compute' + +const bucket = new Bucket() + +const fn = new Function(async () => { + await bucket.put('hello.txt', 'hello, world!') +}) + +export async function main() { + await fn() + const data = await bucket.get('hello.txt', 'utf-8') + console.log(data) +} +``` + +In just two commands, you've deployed to AWS, invoked the function, and grabbed data from the bucket: +```shell +synapse deploy --target aws +synapse run +# hello, world! +``` + +## The Future + +Synapse is still woefully incomplete. It can only deploy to AWS. Some code still fails to compile. But that's okay. Because it will improve. Because, unlike a year ago, I now have a much clearer idea for what needs to be done. + +Thank you for taking the time to read this! If anyone has questions or would like more detailed technical explanations, I'd be happy to share! I tried to keep things somewhat digestible by not covering all the challenges and capabilities of Synapse. diff --git a/integrations/aws/src/services/api-gateway.ts b/integrations/aws/src/services/api-gateway.ts index 90c6b89..38f26f0 100644 --- a/integrations/aws/src/services/api-gateway.ts +++ b/integrations/aws/src/services/api-gateway.ts @@ -32,8 +32,16 @@ export class Gateway { // the below are required without an OpenApi spec name: generateIdentifier(aws.Apigatewayv2Api, 'name', 62), protocolType: 'HTTP', // | WEBSOCKET - disableExecuteApiEndpoint: domain !== undefined, + + // TODO: we only want to disable this endpoint for gateways that + // start with a domain rather than when a domain is added. + // + // Otherwise this becomes a backwards incompatible change which we + // should strive to avoid whenever possible. 
+ // + // disableExecuteApiEndpoint: domain !== undefined, }) + this.resource = apig const stageName: string = '$default' const stage = new aws.Apigatewayv2Stage({ @@ -61,7 +69,6 @@ export class Gateway { public addRoute

( route: P, handler: HttpHandler | HttpHandler, - opt?: { rawBody?: boolean } ): HttpRoute<[...PathArgs

, U], R> { const [method, path] = route.split(' ') if (path === undefined) { @@ -69,7 +76,7 @@ export class Gateway { } const authHandler = typeof this.props?.auth === 'function' ? this.props.auth : undefined - const wrapped = wrapHandler(handler as any, authHandler as any, opt?.rawBody, this.middleware, this.props?.allowedOrigins) + const wrapped = wrapHandler(handler as any, authHandler as any, this.middleware, this.props?.allowedOrigins) const mergeHandlers = this.props?.mergeHandlers ?? true if (mergeHandlers) { if (!this.requestRouter) { @@ -352,10 +359,18 @@ async function runHandler(fn: () => Promise | T): Promise) { + const contentType = headers['content-type'] || headers['Content-Type'] // TODO: check if the headers are already normalized + if (!contentType) { + return false + } + + return !!contentType.match(/application\/(?:([^+\s]+)\+)?json/) +} + function wrapHandler( handler: HttpHandler, authHandler?: HttpHandler, - raw = false, middleware: Middleware[] = [], allowedOrigins?: string[] ) { @@ -363,7 +378,7 @@ function wrapHandler( const decoded = (request.body !== undefined && request.isBase64Encoded) ? Buffer.from(request.body, 'base64').toString('utf-8') : request.body - const body = (decoded && !raw) ? JSON.parse(decoded) : decoded + const body = (decoded && isJsonRequest(request.headers)) ? JSON.parse(decoded) : decoded const stage = request.requestContext.stage const trimmedPath = request.rawPath.replace(`/${stage}`, '') diff --git a/integrations/aws/src/services/s3.ts b/integrations/aws/src/services/s3.ts index dca90e8..f2c68d2 100644 --- a/integrations/aws/src/services/s3.ts +++ b/integrations/aws/src/services/s3.ts @@ -92,8 +92,18 @@ export class Bucket implements storage.Bucket { return { key: key, uploadId: resp.UploadId! 
} } - public async completeMultipartUpload(uploadId: string, key: string) { - await this.client.completeMultipartUpload({ Bucket: this.resource.bucket, Key: key, UploadId: uploadId }) + public async completeMultipartUpload(uploadId: string, key: string, parts: string[]) { + await this.client.completeMultipartUpload({ + Bucket: this.resource.bucket, + Key: key, + UploadId: uploadId, + MultipartUpload: { + Parts: parts.map((p, i) => ({ + PartNumber: i + 1, + ETag: p, + })), + }, + }) } public async getMultipartUploadSignedUrls(uploadId: string, key: string, numParts: number, expiresIn = 3600) { @@ -104,7 +114,8 @@ export class Bucket implements storage.Bucket { const baseReq: Omit = { uploadId, bucket: this.resource.bucket, key } const urls: Promise[] = [] for (let i = 0; i < numParts; i++) { - urls.push(_getUploadPartSignedUrl(this.client, { ...baseReq, part: i + 1 }, expiresIn)) + const url = _getUploadPartSignedUrl(this.client, { ...baseReq, part: i + 1 }, expiresIn) + urls.push(url) } return Promise.all(urls) diff --git a/integrations/local/src/gateway.ts b/integrations/local/src/gateway.ts index 10f8882..7337e70 100644 --- a/integrations/local/src/gateway.ts +++ b/integrations/local/src/gateway.ts @@ -331,14 +331,13 @@ export class Gateway implements compute.HttpService { public addRoute

( route: P, handler: HttpHandler | HttpHandler, - opt?: { rawBody?: boolean } ): HttpRoute<[...PathArgs

, U], R> { const [method, path] = route.split(' ') if (path === undefined) { throw new Error(`Missing method in route: ${route}`) } - this.requestRouter.addRoute(route, wrapHandler(handler, this.authHandler, opt?.rawBody)) + this.requestRouter.addRoute(route, wrapHandler(handler, this.authHandler)) const pathBindings = createPathBindings(path) @@ -463,13 +462,21 @@ function sendResponse(response: http.ServerResponse, data?: any, headers?: Heade }) } +function isJsonRequest(headers: Headers) { + const contentType = headers.get('content-type') + if (!contentType) { + return false + } + + return !!contentType.match(/application\/(?:([^+\s]+)\+)?json/) +} + function wrapHandler( handler: HttpHandler, authHandler?: HttpHandler, - raw = false, ) { async function handleRequest(req: HttpRequest, data?: string) { - const body = (data && !raw) ? JSON.parse(data) : data + const body = (data && isJsonRequest(req.headers)) ? JSON.parse(data) : data if (authHandler) { const resp = await authHandler(req, body) diff --git a/src/build/sea.ts b/src/build/sea.ts index 7b8fe4f..580f3ea 100644 --- a/src/build/sea.ts +++ b/src/build/sea.ts @@ -1,5 +1,5 @@ import * as path from 'node:path' -import { getFs } from '../execution' +import { getFs, getSelfPath } from '../execution' import { runCommand } from '../utils/process' import { getHash, makeExecutable, throwIfNotFileNotFoundError } from '../utils' import { createRequire } from 'node:module' @@ -19,14 +19,27 @@ interface Postject { remove?: (filename: string, resourceName: string, options?: PostjectOptions) => Promise } +function loadFromRelPath() { + const selfPath = getSelfPath() + if (!selfPath) { + return + } + + return createRequire(selfPath)('./postject') +} + function getPostject(): Postject { // This is so we can still load `postject` as an SEA const loadFromCwd = () => createRequire(path.resolve(process.cwd(), 'package.json'))('postject') try { - return require('postject') + return loadFromRelPath() } catch { - return 
loadFromCwd() + try { + return require('postject') + } catch { + return loadFromCwd() + } } } diff --git a/src/cli/buildInternal.ts b/src/cli/buildInternal.ts index 5e7b588..e91f341 100644 --- a/src/cli/buildInternal.ts +++ b/src/cli/buildInternal.ts @@ -565,6 +565,19 @@ export async function createPackageForRelease(pkgDir: string, dest: string, targ return true } + async function maybeCopyPostjectBundle() { + const postject = path.resolve(dest, 'node_modules', 'postject', 'dist', 'api.js') + const data = await getFs().readFile(postject).catch(throwIfNotFileNotFoundError) + if (!data) { + getLogger().warn('Package `postject` not found') + return + } + + await getFs().writeFile(path.resolve(dest, 'dist', 'postject.js'), data) + } + + await maybeCopyPostjectBundle() + if (await maybeCopyEsbuildBinary()) { await getFs().deleteFile(path.resolve(dest, 'node_modules', '@esbuild')).catch(throwIfNotFileNotFoundError) diff --git a/src/cli/commands.ts b/src/cli/commands.ts index d12acff..8e81f9f 100644 --- a/src/cli/commands.ts +++ b/src/cli/commands.ts @@ -912,9 +912,13 @@ registerTypedCommand( registerTypedCommand( 'build', { - hidden: true, + description: 'Builds all executables in the current program.', args: [{ name: 'target', type: typescriptFileType, allowMultiple: true }], - options: [{ name: 'lazy-load', type: 'string', allowMultiple: true }, { name: 'no-sea', type: 'boolean' }], + options: [ + { name: 'lazy-load', type: 'string', allowMultiple: true }, + { name: 'no-sea', type: 'boolean' }, + { name: 'synapse-path', type: 'string' } + ], }, (...args) => { const [files, opt] = unpackArgs(args) @@ -922,6 +926,7 @@ registerTypedCommand( return synapse.buildExecutables(files, { sea: !opt['no-sea'], lazyLoad: opt['lazy-load'], + synapsePath: opt['synapse-path'], }) }, ) @@ -1200,7 +1205,7 @@ registerTypedCommand( // isImportantCommand: true, description: 'Creates a new package in the current directory', options: [ - { name: 'template', type: 
createEnumType('hello-world', 'react') } + { name: 'template', type: 'string' } ] }, (opt) => synapse.init(opt) diff --git a/src/git.ts b/src/git.ts index 37e72d5..ea20a09 100644 --- a/src/git.ts +++ b/src/git.ts @@ -130,12 +130,21 @@ export function getCurrentBranchSync(dir = process.cwd()) { export async function openRemote(remote: string) { const dest = path.resolve(getGitDirectory(), 'remotes', remote) - await mkdir(dest, { recursive: true }) - const cloneResult = await runCommand( - 'git', - ['clone', '--depth', '1', '--no-checkout', '--no-tags', '--filter=blob:none', remote, dest] - ) + if (!(await getFs().fileExists(dest))) { + await mkdir(dest, { recursive: true }) + + const cloneResult = await runCommand( + 'git', + ['clone', '--depth', '1', '--no-checkout', '--no-tags', '--filter=blob:none', remote, dest] + ) + } else { + await runCommand( + 'git', + ['fetch', '--depth', '1', '--no-tags', '--filter=blob:none', remote], + { cwd: dest } + ) + } const treeResult = await runCommand( 'git', diff --git a/src/index.ts b/src/index.ts index 68c597f..e66157f 100644 --- a/src/index.ts +++ b/src/index.ts @@ -61,6 +61,7 @@ import { homedir } from 'node:os' import { createBlock, openBlock } from './build-fs/block' import { seaAssetPrefix } from './bundler' import { buildWindowsShim } from './zig/compile' +import { openRemote } from './git' export { runTask, getLogger } from './logging' @@ -2012,11 +2013,26 @@ export async function schemas(type?: string, opt?: CombinedOptions) { } } +const examplesRepoUrl = 'https://github.com/Cohesible/synapse' +async function initFromRepo(name: string, dest: string) { + const repo = await openRemote(examplesRepoUrl) + const prefix = `examples/${name}/` + const files = repo.files.filter(f => f.name.startsWith(prefix)) + if (files.length === 0) { + throw new Error(`No example found named "${name}"`) + } + + await Promise.all(files.map(async f => getFs().writeFile( + path.resolve(dest, f.name.slice(prefix.length)), + await f.read() + ))) 
+ + await repo.dispose() + + return files.map(f => f.name.slice(prefix.length)) +} -// This inits a new package w/ scaffolding -// `initWorkspace` initializes a non-empty directory -// TODO: we should clone from GitHub -export async function init(opt?: { template?: 'hello-world' | 'react' }) { +export async function init(opt?: { template?: string }) { const fs = getFs() const dir = process.cwd() const dirFiles = (await fs.readDirectory(dir)).filter(f => f.name !== '.git') @@ -2037,9 +2053,7 @@ export async function init(opt?: { template?: 'hello-world' | 'react' }) { process.env['AWS_ROLE_ARN'] ) } - - const probablyHasAwsCredentials = await detectAwsCredentials() - + printLine(colorize('green', `Created files:`)) for (const f of filesCreated) { printLine(colorize('green', ` ${f}`)) @@ -2048,6 +2062,10 @@ export async function init(opt?: { template?: 'hello-world' | 'react' }) { printLine(colorize('gray', '"node_modules" was created for better editor support')) } printLine() + + if (filesCreated.find(f => f === 'README.md')) { + return + } const deployCmd = renderCmdSuggestion('deploy') const targetOption = colorize('gray', '--target aws') @@ -2056,6 +2074,8 @@ export async function init(opt?: { template?: 'hello-world' | 'react' }) { printLine() printLine(`By default, your code is built for and deployed to a "local" target.`) + const probablyHasAwsCredentials = await detectAwsCredentials() + if (probablyHasAwsCredentials) { printLine(`You can target AWS by adding ${targetOption} to a compile or deploy command.`) printLine(`The target is remembered for subsequent commands.`) @@ -2082,12 +2102,9 @@ export async function init(opt?: { template?: 'hello-world' | 'react' }) { await getFs().writeFile(path.resolve(dir, 'package.json'), JSON.stringify(pkg, undefined, 4)) const text = 
'aW1wb3J0IHsgU3VzcGVuc2UsIHVzZVJlZiB9IGZyb20gJ3JlYWN0JwppbXBvcnQgeyBCdWNrZXQgfSBmcm9tICdzeW5hcHNlOnNybC9zdG9yYWdlJwppbXBvcnQgeyBjcmVhdGVXZWJzaXRlIH0gZnJvbSAnQGNvaGVzaWJsZS9zeW5hcHNlLXJlYWN0JwppbXBvcnQgeyB1c2VTZXJ2ZXIsIG9wZW5Ccm93c2VyIH0gZnJvbSAnQGNvaGVzaWJsZS9zeW5hcHNlLXdlYnNpdGVzJwoKY29uc3Qgd2Vic2l0ZSA9IGNyZWF0ZVdlYnNpdGUoKQpjb25zdCBidWNrZXQgPSBuZXcgQnVja2V0KCkKCmNvbnN0IGdldERhdGEgPSAoa2V5OiBzdHJpbmcpID0+IHsKICAgIHJldHVybiBidWNrZXQuZ2V0KGtleSwgJ3V0Zi04JykuY2F0Y2goZSA9PiB7CiAgICAgICAgcmV0dXJuIChlIGFzIGFueSkubWVzc2FnZQogICAgfSkKfQoKZnVuY3Rpb24gQnVja2V0Q29udGVudHMocHJvcHM6IHsgYnVja2V0S2V5OiBzdHJpbmcgfSkgewogICAgY29uc3QgZGF0YSA9IHVzZVNlcnZlcihnZXREYXRhLCBwcm9wcy5idWNrZXRLZXkpCgogICAgcmV0dXJuIDxwcmU+e2RhdGF9PC9wcmU+Cn0KCmZ1bmN0aW9uIEJ1Y2tldFBhZ2UocHJvcHM6IHsgYnVja2V0S2V5OiBzdHJpbmcgfSkgewogICAgcmV0dXJuICgKICAgICAgICA8ZGl2PgogICAgICAgICAgICA8U3VzcGVuc2UgZmFsbGJhY2s9ezxkaXY+bG9hZGluZzwvZGl2Pn0+CiAgICAgICAgICAgICAgICA8QnVja2V0Q29udGVudHMgYnVja2V0S2V5PXtwcm9wcy5idWNrZXRLZXl9Lz4KICAgICAgICAgICAgPC9TdXNwZW5zZT4KICAgICAgICA8L2Rpdj4KICAgICkKfQoKZnVuY3Rpb24gUm9vdExheW91dCh7IGNoaWxkcmVuIH06IHsgY2hpbGRyZW46IEpTWC5FbGVtZW50IHwgSlNYLkVsZW1lbnRbXSB9KSB7CiAgICByZXR1cm4gKAogICAgICAgIDxodG1sIGxhbmc9ImVuIj4KICAgICAgICAgICAgPGhlYWQ+PC9oZWFkPgogICAgICAgICAgICA8Ym9keT57Y2hpbGRyZW59PC9ib2R5PgogICAgICAgIDwvaHRtbD4KICAgICkKfQoKY29uc3QgYWRkRGF0YSA9IHdlYnNpdGUuYmluZChhc3luYyAoa2V5OiBzdHJpbmcsIGRhdGE6IHN0cmluZykgPT4gewogICAgYXdhaXQgYnVja2V0LnB1dChrZXksIGRhdGEpCn0pCgpmdW5jdGlvbiBCdWNrZXRGb3JtVGhpbmcoKSB7CiAgICBjb25zdCBrZXlSZWYgPSB1c2VSZWY8SFRNTElucHV0RWxlbWVudD4oKQogICAgY29uc3QgdmFsdWVSZWYgPSB1c2VSZWY8SFRNTElucHV0RWxlbWVudD4oKQoKICAgIGZ1bmN0aW9uIHN1Ym1pdCgpIHsKICAgICAgICBjb25zdCBrZXkgPSBrZXlSZWYuY3VycmVudC52YWx1ZQogICAgICAgIGNvbnN0IHZhbHVlID0gdmFsdWVSZWYuY3VycmVudC52YWx1ZQoKICAgICAgICBhZGREYXRhKGtleSwgdmFsdWUpLnRoZW4oKCkgPT4gewogICAgICAgICAgICB3aW5kb3cubG9jYXRpb24gPSB3aW5kb3cubG9jYXRpb24KICAgICAgICB9KQogICAgfQoKICAgIHJldHVybiAoCiAgICAgICAgPGRpdj4KICAgICAgICAgICAgPGxhYmVsPgogICAgICAgICAgICAgICAgS2V
5CiAgICAgICAgICAgICAgICA8aW5wdXQgdHlwZT0ndGV4dCcgcmVmPXtrZXlSZWZ9PjwvaW5wdXQ+CiAgICAgICAgICAgIDwvbGFiZWw+CiAgICAgICAgICAgIDxsYWJlbD4KICAgICAgICAgICAgICAgIFZhbHVlCiAgICAgICAgICAgICAgICA8aW5wdXQgdHlwZT0ndGV4dCcgcmVmPXt2YWx1ZVJlZn0+PC9pbnB1dD4KICAgICAgICAgICAgPC9sYWJlbD4KICAgICAgICAgICAgPGJ1dHRvbiBvbkNsaWNrPXtzdWJtaXR9IHN0eWxlPXt7IG1hcmdpbkxlZnQ6ICcxMHB4JyB9fT5BZGQgSXRlbTwvYnV0dG9uPgogICAgICAgIDwvZGl2PgogICAgKQp9Cgphc3luYyBmdW5jdGlvbiBnZXRJdGVtcygpIHsKICAgIHJldHVybiBhd2FpdCBidWNrZXQubGlzdCgpCn0KCmNvbnN0IGRvRGVsZXRlID0gd2Vic2l0ZS5iaW5kKChrZXk6IHN0cmluZykgPT4gYnVja2V0LmRlbGV0ZShrZXkpKQoKZnVuY3Rpb24gQnVja2V0SXRlbShwcm9wczogeyBidWNrZXRLZXk6IHN0cmluZyB9KSB7CiAgICBjb25zdCBrID0gcHJvcHMuYnVja2V0S2V5CgogICAgZnVuY3Rpb24gZGVsZXRlSXRlbSgpIHsKICAgICAgICBkb0RlbGV0ZShrKS50aGVuKCgpID0+IHsKICAgICAgICAgICAgd2luZG93LmxvY2F0aW9uID0gd2luZG93LmxvY2F0aW9uCiAgICAgICAgfSkKICAgIH0KCiAgICByZXR1cm4gKAogICAgICAgIDxsaT4KICAgICAgICAgICAgPGRpdiBzdHlsZT17eyBkaXNwbGF5OiAnZmxleCcsIG1heFdpZHRoOiAnMjUwcHgnLCBtYXJnaW5Cb3R0b206ICcxMHB4JyB9fT4KICAgICAgICAgICAgICAgIDxhIGhyZWY9e2AvYnVja2V0LyR7a31gfSBzdHlsZT17eyBmbGV4OiAnZml0LWNvbnRlbnQnLCBhbGlnblNlbGY6ICdmbGV4LXN0YXJ0JyB9fT57a308L2E+CiAgICAgICAgICAgICAgICA8YnV0dG9uIG9uQ2xpY2s9e2RlbGV0ZUl0ZW19IHN0eWxlPXt7IGFsaWduU2VsZjogJ2ZsZXgtZW5kJyB9fT5EZWxldGU8L2J1dHRvbj4KICAgICAgICAgICAgPC9kaXY+CiAgICAgICAgPC9saT4KICAgICkKfQoKZnVuY3Rpb24gSXRlbUxpc3QoKSB7CiAgICBjb25zdCBpdGVtcyA9IHVzZVNlcnZlcihnZXRJdGVtcykKCiAgICBpZiAoaXRlbXMubGVuZ3RoID09PSAwKSB7CiAgICAgICAgcmV0dXJuIDxkaXY+PGI+VGhlcmUncyBub3RoaW5nIGluIHRoZSBidWNrZXQhPC9iPjwvZGl2PgogICAgfQoKICAgIHJldHVybiAoCiAgICAgICAgPHVsPgogICAgICAgICAgICB7aXRlbXMubWFwKGsgPT4gPEJ1Y2tldEl0ZW0ga2V5PXtrfSBidWNrZXRLZXk9e2t9Lz4pfQogICAgICAgIDwvdWw+CiAgICApCn0KCmZ1bmN0aW9uIEhvbWVQYWdlKCkgewogICAgcmV0dXJuICgKICAgICAgICA8ZGl2PgogICAgICAgICAgICA8QnVja2V0Rm9ybVRoaW5nPjwvQnVja2V0Rm9ybVRoaW5nPgogICAgICAgICAgICA8YnI+PC9icj4KICAgICAgICAgICAgPFN1c3BlbnNlIGZhbGxiYWNrPSdsb2FkaW5nJz4KICAgICAgICAgICAgICAgIDxJdGVtTGlzdC8+CiAgICAgICAgICAgIDwvU3VzcGVuc2U+CiAgICA
gICAgPC9kaXY+CiAgICApCn0KCndlYnNpdGUuYWRkUGFnZSgnLycsIHsKICAgIGNvbXBvbmVudDogSG9tZVBhZ2UsCiAgICBsYXlvdXQ6IHsgY29tcG9uZW50OiBSb290TGF5b3V0IH0sCn0pCiAgICAKCndlYnNpdGUuYWRkUGFnZSgnL2J1Y2tldC97YnVja2V0S2V5fScsIHsKICAgIGNvbXBvbmVudDogQnVja2V0UGFnZSwKICAgIGxheW91dDogeyBjb21wb25lbnQ6IFJvb3RMYXlvdXQgfSwKfSkKCmV4cG9ydCBhc3luYyBmdW5jdGlvbiBtYWluKCkgewogICAgb3BlbkJyb3dzZXIod2Vic2l0ZS51cmwpCn0KCg==' await getFs().writeFile(path.resolve(dir, 'app.tsx'), Buffer.from(text, 'base64')) - await showInstructions(['app.tsx', 'package.json', 'tsconfig.json']) - - return - } - - const text = ` + await showInstructions(['app.tsx', 'package.json', 'tsconfig.json']) + } else if (!opt?.template) { + const text = ` import { Function } from 'synapse:srl/compute' const hello = new Function(() => { @@ -2097,11 +2114,15 @@ const hello = new Function(() => { export async function main(...args: string[]) { console.log(await hello()) } -`.trimStart() - - await fs.writeFile(path.resolve(dir, 'hello.ts'), text, { flag: 'wx' }) - - await showInstructions(['hello.ts']) + `.trimStart() + + await fs.writeFile(path.resolve(dir, 'hello.ts'), text, { flag: 'wx' }) + + await showInstructions(['hello.ts']) + } else { + const files = await initFromRepo(opt.template, dir) + await showInstructions(files) + } } export async function clearCache(targetKey?: string, opt?: CombinedOptions) { @@ -2640,6 +2661,7 @@ export function runUserScript(target: string) { interface BuildExecutableOpt { readonly sea?: boolean readonly lazyLoad?: string[] + readonly synapsePath?: string } export async function buildExecutables(targets: string[], opt: BuildExecutableOpt) { @@ -2656,6 +2678,10 @@ export async function buildExecutables(targets: string[], opt: BuildExecutableOp } async function _getNodePath() { + if (opt.synapsePath) { + return opt.synapsePath + } + if (isSelfSea()) { return process.execPath } @@ -2676,24 +2702,38 @@ export async function buildExecutables(targets: string[], opt: BuildExecutableOp const external = 
['esbuild', 'typescript', 'postject'] // XXX: this is hard-coded to `synapse` const bundleOpt = pkg.data.name === 'synapse' ? { + sea: opt.sea, external, lazyLoad: ['@cohesible/*', 'typescript', 'esbuild', ...lazyNodeModules], extraBuiltins: ['typescript', 'esbuild'], - } : opt + } : { + ...opt, + runtimeExecutable: opt.synapsePath, + } if (pkg.data.name === 'synapse') { process.env.SKIP_SEA_MAIN = '1' process.env.CURRENT_PACKAGE_DIR = pkg.directory } + const config = (await getResolvedTsConfig())?.options + const outDir = config?.outDir ?? 'out' + for (const [k, v] of Object.entries(bin)) { const resolved = path.resolve(bt.workingDirectory, v) if (!set.has(resolved)) continue - const res = await bundleExecutable(bt, resolved, undefined, undefined, { sea: opt.sea, ...bundleOpt }) - const dest = path.resolve(bt.workingDirectory, 'dist', 'bin', k) + const outfile = !config?.outDir ? path.resolve(bt.workingDirectory, 'out', v) : undefined + + const res = await bundleExecutable(bt, resolved, outfile, undefined, bundleOpt) + const dest = path.resolve(bt.workingDirectory, outDir, 'bin', k) + if (opt.sea) { - await makeSea(res.outfile, await getNodePath(), dest, res.assets) + try { + await makeSea(res.outfile, await getNodePath(), dest, res.assets) + } finally { + await getFs().deleteFile(res.outfile) + } } else { // TODO: write out assets } diff --git a/src/runtime/srl/compute/index.ts b/src/runtime/srl/compute/index.ts index 8d15d41..a223d88 100644 --- a/src/runtime/srl/compute/index.ts +++ b/src/runtime/srl/compute/index.ts @@ -85,22 +85,10 @@ export declare class HttpService { /** @internal */ forward(req: HttpRequest, body: any): Promise - //# resource = true - /** @internal */ - addRoute( - route: U, - handler: HttpHandler, - opt: { rawBody: true } - ): HttpRoute<[...PathArgs, string], R> - //# resource = true addRoute

( route: P, - handler: HttpHandler, - opt?: { - /** @internal */ - rawBody?: boolean - } + handler: HttpHandler ): HttpRoute<[...PathArgs

, ...(U extends undefined ? [] : [body: U])], R> /** TODO */ @@ -109,18 +97,6 @@ export declare class HttpService { // ): any } -// //# resource = true -// export declare class WebSocketService { -// readonly invokeUrl: string -// constructor(opt?: HttpServiceOptions) - -// addWebSocketRoute

( -// route: P, -// handler: HttpHandler, -// opt?: { rawBody?: boolean } -// ): HttpRoute<[...PathArgs

, ...(U extends undefined ? [] : [body: U])], R> -// } - /** @internal */ export interface ContainerInstance { readonly name: string