settings schema.json #1874

Merged
merged 9 commits into main from sylvie/settings-schema on Feb 3, 2025

Conversation

oneirocosm
Member

Adds schema.json support to the settings file to provide type hints and, eventually, other details. It also adds a system for easily adding more schema files for other types of configuration.

  • This adds a version of the schema that relies on hard-coded data, but shows that this is a viable way to implement it.
  • This makes it easier to load multiple schema files and configure them with specific filepaths.
  • This generates the schema files and adds them to the server as a build step.
  • This may fix the light-mode Monaco bug along with schema initialization.
Contributor

coderabbitai bot commented Jan 30, 2025

Walkthrough

This pull request introduces several enhancements to the project's configuration and schema management system. Key changes include the addition of a new task in Taskfile.yml for schema generation, which is now integrated into the build process. A new command-line application, main-generateschema.go, has been created to generate a JSON schema based on the SettingsType structure, which is then served via a new HTTP endpoint established in the schema package.

Modifications to the frontend code editor include updates to the loadMonaco function for asynchronous loading, improved handling of remote files, and integration of schema-based validation through a new schemaendpoints.ts file. Additionally, a new JSON schema file, settings.json, has been introduced to define the structure and validation rules for application settings. Collectively, these changes enhance the application's configuration management and editor functionality.
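
As a rough sketch of how that wiring typically looks with Monaco's JSON language service (the helper below and its option values are illustrative assumptions, not the PR's literal code):

import * as monaco from "monaco-editor";

// Hypothetical helper: fetch schema info for each endpoint and register it with
// Monaco's built-in JSON diagnostics so matching files get validation and hints.
async function configureJsonSchemas(
    endpoints: string[],
    getInfo: (ep: string) => Promise<{ uri: string; fileMatch: string[]; schema: object }>
): Promise<void> {
    const schemas = await Promise.all(endpoints.map((ep) => getInfo(ep)));
    monaco.languages.json.jsonDefaults.setDiagnosticsOptions({
        validate: true,
        schemas,
    });
}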


@coderabbitai bot left a comment

Actionable comments posted: 5

🧹 Nitpick comments (11)
frontend/app/view/codeeditor/codeeditor.tsx (5)

53-63: Defining the dark theme
These theme definitions provide a polished, custom look. Ensure color choices are tested across different display profiles for readability.
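
For reference, a custom Monaco theme like this is registered with monaco.editor.defineTheme; the theme name and color values below are placeholders, not the ones used in this PR:

import * as monaco from "monaco-editor";

// Illustrative dark theme registration; names and colors are hypothetical.
monaco.editor.defineTheme("example-dark", {
    base: "vs-dark", // start from the built-in dark theme
    inherit: true,   // keep token rules not overridden below
    rules: [],       // token-level color overrides would go here
    colors: {
        "editor.background": "#1e1e1e",
    },
});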


73-76: YAML configuration
Enabling YAML validation is a strong start. If additional YAML schemas are needed in the future, this is the right spot to configure them.


81-81: Concurrent schema fetching
Splitting schema retrieval among multiple endpoints is efficient. A fallback or partial handling could be beneficial if one endpoint fails.
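
A minimal sketch of such partial handling, reusing the EndpointInfo shape described in this review (the function name is an assumption):

type EndpointInfo = { uri: string; fileMatch: string[]; schema: object };

// Fetch all schema endpoints concurrently, but keep whatever succeeded if one fails.
async function fetchSchemasWithFallback(
    endpoints: string[],
    getInfo: (ep: string) => Promise<EndpointInfo>
): Promise<EndpointInfo[]> {
    const results = await Promise.allSettled(endpoints.map((ep) => getInfo(ep)));
    return results
        .filter((r): r is PromiseFulfilledResult<EndpointInfo> => r.status === "fulfilled")
        .map((r) => r.value);
}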


82-87: JSON diagnostics with schema
Enabling JSON schema validation here is an excellent enhancement for the editor. If comments need to be allowed in JSON files, consider toggling allowComments:true.


154-156: Logging of absPath
Console logging can be noisy in production. Wrap this in a development flag or remove it if no longer needed.
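
A small sketch of that guard; import.meta.env.DEV assumes a Vite-style build flag, so substitute whatever development flag the project actually exposes:

// Only emit the path-resolution log in development builds.
if (import.meta.env.DEV) {
    console.log("resolved absPath:", absPath);
}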

frontend/app/view/codeeditor/schemaendpoints.ts (1)

7-11: Type definition for EndpointInfo
Clear structure for storing schema info. Adding inline documentation might help future maintainers understand each field.

cmd/generateschema/main-generateschema.go (2)

17-17: Named constant for schema path
Using a named constant boosts clarity and maintainability. Expand this system if more schema files are generated later.


19-39: Main function for schema generation
Reflect-based schema generation is a practical solution.

  • Error handling is appropriate for a CLI tool.
  • If you intend to add more schema files, consider parameterizing the process.
pkg/schema/schema.go (1)

33-38: Consider adding more security headers.

While setting Content-Type is good, consider adding security headers like:

  • X-Content-Type-Options: nosniff
  • X-Frame-Options: DENY
 func addHeaders(next http.Handler) http.Handler {
 	return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
 		w.Header().Add("Content-Type", "application/schema+json")
+		w.Header().Add("X-Content-Type-Options", "nosniff")
+		w.Header().Add("X-Frame-Options", "DENY")
 		next.ServeHTTP(w, r)
 	})
 }
schema/settings.json (2)

1-4: Enhance schema metadata for better documentation.

Consider adding the following metadata fields to improve schema documentation:

  • title: A human-readable title for the schema
  • description: A detailed description of the schema's purpose
  • $comment: Additional implementation notes or usage examples
 {
   "$schema": "https://json-schema.org/draft/2020-12/schema",
   "$id": "https://github.com/wavetermdev/waveterm/pkg/wconfig/settings-type",
+  "title": "Wave Terminal Settings",
+  "description": "Configuration schema for Wave Terminal settings",
+  "$comment": "This schema defines all available configuration options for Wave Terminal",
   "$ref": "#/$defs/SettingsType",

23-25: Add enums for properties with predefined values.

The following string properties would benefit from enum constraints to ensure valid values:

  • ai:apitype
  • term:theme
  • autoupdate:channel

Example for theme:

   "term:theme": {
-    "type": "string"
+    "type": "string",
+    "enum": ["light", "dark", "system"],
+    "default": "system"
   },

Also applies to: 65-67, 134-136

📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 5db84e5 and 50766b0.

⛔ Files ignored due to path filters (1)
  • go.sum is excluded by !**/*.sum
📒 Files selected for processing (9)
  • Taskfile.yml (3 hunks)
  • cmd/generateschema/main-generateschema.go (1 hunks)
  • frontend/app/view/codeeditor/codeeditor.tsx (5 hunks)
  • frontend/app/view/codeeditor/schemaendpoints.ts (1 hunks)
  • frontend/wave.ts (1 hunks)
  • go.mod (4 hunks)
  • pkg/schema/schema.go (1 hunks)
  • pkg/web/web.go (3 hunks)
  • schema/settings.json (1 hunks)
⏰ Context from checks skipped due to timeout of 90000ms (5)
  • GitHub Check: Analyze (javascript-typescript)
  • GitHub Check: Analyze (go)
  • GitHub Check: merge-gatekeeper
  • GitHub Check: Build Docsite
  • GitHub Check: Build for TestDriver.ai
🔇 Additional comments (21)
frontend/app/view/codeeditor/codeeditor.tsx (9)

12-14: Good addition of remote file utility imports
These imports are needed for remote file path resolution. The usage aligns well with the new remote file join functionality in the subsequent code.


20-20: Separation of schema logic
Importing SchemaEndpoints and getSchemaEndpointInfo organizes schema logic in a separate file, making this code more maintainable.


49-49: Refactor to async function
Switching loadMonaco to an async function clarifies the flow with concise await usage, improving readability.


51-52: Await loader initialization
Using await loader.init() is more readable than promise chaining; consider logging or error handling if initialization fails.


64-72: Defining the light theme
Likewise, the light theme appears consistent with the dark variant. Great job maintaining parallel structures for theming.


78-80: Disabling TypeScript diagnostics
Preventing semantic validation can mask legitimate issues if you rely solely on external checks. Confirm that this setting aligns with your development strategy.
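
The Monaco call for this typically looks like the following; the exact flag values are assumptions shown only to make the trade-off concrete:

import * as monaco from "monaco-editor";

// Silence semantic (type) diagnostics while keeping syntax errors visible.
monaco.languages.typescript.typescriptDefaults.setDiagnosticsOptions({
    noSemanticValidation: true,
    noSyntaxValidation: false,
});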


129-129: New state variable: absPath
Storing a remote-derived file path in state is a solid approach. This helps keep the UI in sync with RPC-driven path resolution.


140-152: Remote file path resolution effect
Using RpcApi.RemoteFileJoinCommand to derive the absolute path is effective. Consider gracefully handling or surfacing errors if the command fails (e.g., user feedback).
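
A hedged sketch of the suggested error handling; resolveRemotePath is a hypothetical wrapper around the RpcApi.RemoteFileJoinCommand call mentioned above, since its exact signature isn't shown in this review:

import { useEffect, useState } from "react";

// Hypothetical wrapper around the RPC path-join call; signature assumed for illustration.
declare function resolveRemotePath(fileName: string): Promise<string>;

function useAbsPath(fileName: string): string {
    const [absPath, setAbsPath] = useState(fileName);
    useEffect(() => {
        let cancelled = false;
        resolveRemotePath(fileName)
            .then((path) => {
                if (!cancelled) setAbsPath(path);
            })
            .catch((e) => {
                // Surface the failure and fall back to the unresolved path.
                console.error("remote path resolution failed:", e);
                if (!cancelled) setAbsPath(fileName);
            });
        return () => {
            cancelled = true;
        };
    }, [fileName]);
    return absPath;
}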


188-188: Binding path prop to absPath
Passing absPath to the editor ensures the correct file model is loaded. This is a key improvement for remote file support.

frontend/app/view/codeeditor/schemaendpoints.ts (5)

1-2: Consistent license header
The license header matches the project’s conventions.


4-5: API and endpoint imports
Straightforward imports. Confirm these references remain stable if endpoint locations or definitions change downstream.


13-14: Mapping filepaths to endpoints
Using a Map to associate URIs with local file paths is elegantly minimal. Consider dynamic population if new schemas are introduced.


30-30: Deriving schema endpoint array
Creating the array of schema endpoints from map keys is concise. Good approach for scalability.


32-32: Exporting schema utilities
Exporting both getSchemaEndpointInfo and SchemaEndpoints keeps the module organized for broad usage across the codebase.

cmd/generateschema/main-generateschema.go (1)

1-2: License header
Matches the standard project license.

pkg/schema/schema.go (1)

17-31: Consider adding error handling for edge cases in GetSchemaHandler.

The initialization logic could be more robust:

  1. No mutex protection for concurrent initialization of schemaHandler
  2. No validation of schemaStaticPath for potential path traversal

Run this script to check for potential path traversal vulnerabilities:

#!/bin/bash
# Description: Check for path traversal vulnerabilities in schema path handling

# Test: Search for path validation in schema handling
rg -A 5 'filepath\.Clean|filepath\.Rel|filepath\.Join' pkg/schema/
frontend/wave.ts (1)

186-186: Verify the impact of synchronous Monaco loading.

The change from setTimeout to await loadMonaco() could impact the initialization sequence. Ensure that:

  1. All required dependencies are available before Monaco loads
  2. No UI blocking occurs during initialization
#!/bin/bash
# Description: Check for Monaco editor initialization dependencies

# Test: Search for Monaco editor dependencies and initialization
rg -A 5 'loadMonaco|monaco\.editor' frontend/
pkg/web/web.go (1)

Line range hint 413-423: LGTM! Clean integration of schema handler.

The schema route is properly integrated using the standard pattern used for other routes in the application.

go.mod (1)

21-21: Verification script:

#!/bin/bash
# Check Go version requirements in go.mod and repository

echo "=== Checking go.mod for Go version ==="
cat go.mod | grep -E "^go "

echo -e "\n=== Checking repository for version info ==="

# Look for Go version mentions in README or docs
fd -e md -e txt | xargs grep -l "go 1." | while read -r file; do
    echo "=== $file ==="
    cat "$file" | grep -E "go 1.[0-9]+"
done


schema/settings.json (1)

228-230: LGTM! Strict schema validation is enforced.

The schema correctly prevents additional properties and enforces object type.

Taskfile.yml (1)

28-28: LGTM! Schema build task is correctly integrated.

The build:schema task is appropriately added as a dependency to the electron:dev, electron:start, and package tasks.

Also applies to: 42-42, 112-112


Comment on lines 16 to 28
async function getSchemaEndpointInfo(endpoint: string): Promise<EndpointInfo> {
    const data = await fetch(endpoint);
    const fullSchema: object = await data.json();
    const schemaRef = fullSchema?.["$ref"];
    const schema = fullSchema?.[schemaRef];
    const fileMatch = allFilepaths.get(endpoint) ?? [];

    return {
        uri: endpoint,
        fileMatch,
        schema,
    };
}

🛠️ Refactor suggestion

Schema retrieval via fetch
Fetching and parsing the schema is sensible. You might want to handle fetching errors or invalid $ref references more robustly to avoid runtime surprises.

Comment on lines +44 to +54
func (d JsonDir) Open(name string) (http.File, error) {
	// Try name as supplied
	f, err := d.d.Open(name)
	if os.IsNotExist(err) {
		// Not found, try with .json
		if f, err := d.d.Open(name + ".json"); err == nil {
			return f, nil
		}
	}
	return f, err
}

🛠️ Refactor suggestion

Improve error handling in JsonDir.Open.

The error handling could be more explicit:

  1. Original error is lost when .json extension is tried
  2. No logging of file access attempts
 func (d JsonDir) Open(name string) (http.File, error) {
+	// Clean the path to prevent directory traversal
+	name = filepath.Clean(name)
+
 	// Try name as supplied
 	f, err := d.d.Open(name)
 	if os.IsNotExist(err) {
 		// Not found, try with .json
-		if f, err := d.d.Open(name + ".json"); err == nil {
+		if f, jsonErr := d.d.Open(name + ".json"); jsonErr == nil {
 			return f, nil
+		} else {
+			// Return original error to maintain context
+			return nil, err
 		}
 	}
 	return f, err
 }

Comment on lines +44 to +46
"ai:maxtokens": {
"type": "number"
},

🛠️ Refactor suggestion

Add constraints for numeric properties.

The following numeric properties should have appropriate min/max constraints to prevent invalid values:

  • ai:maxtokens
  • ai:timeoutms
  • term:transparency
  • window:opacity
  • window:magnifiedblockopacity
  • window:magnifiedblocksize

Example fix for transparency and opacity:

   "term:transparency": {
-    "type": "number"
+    "type": "number",
+    "minimum": 0,
+    "maximum": 1
   },
   // ... similar changes for other properties

Also applies to: 47-49, 86-88, 164-166, 188-190, 191-193

Comment on lines +29 to +31
"ai:apitoken": {
"type": "string"
},

🛠️ Refactor suggestion

Add security annotations for sensitive data.

The ai:apitoken property contains sensitive data and should be marked accordingly.

   "ai:apitoken": {
-    "type": "string"
+    "type": "string",
+    "writeOnly": true,
+    "contentMediaType": "text/plain",
+    "$comment": "This field contains sensitive data"
   },

Taskfile.yml Outdated
Comment on lines 120 to 131
build:schema:
    desc: Build the schema for configuration.
    sources:
    generates:
        - "dist/schema/**/*"
    cmds:
        - go run cmd/generateschema/main-generateschema.go
        - '{{.RMRF}} "dist/schema"'
        - task: copyfiles:'schema':'dist/schema'
    deps:
        - generate


🛠️ Refactor suggestion

Add source files to the build:schema task.

The sources field is empty, which means the task won't track changes to the schema generation source files.

     build:schema:
         desc: Build the schema for configuration.
         sources:
+            - "cmd/generateschema/*.go"
+            - "pkg/wconfig/*.go"
         generates:
             - "dist/schema/**/*"
         cmds:
             - go run cmd/generateschema/main-generateschema.go
             - '{{.RMRF}} "dist/schema"'
             - task: copyfiles:'schema':'dist/schema'
         deps:
             - generate

@coderabbitai bot left a comment

Actionable comments posted: 3

🧹 Nitpick comments (1)
frontend/app/view/codeeditor/schemaendpoints.ts (1)

7-11: Consider using a more specific type for the schema field.

Instead of using a generic object type for the schema field, consider defining a specific interface that represents the expected JSON schema structure. This would provide better type safety and documentation.

 type EndpointInfo = {
     uri: string;
     fileMatch: Array<string>;
-    schema: object;
+    schema: JSONSchema;
 };

+interface JSONSchema {
+    type: string;
+    properties?: Record<string, unknown>;
+    required?: string[];
+    // Add other JSON Schema properties as needed
+}
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 475436d and 29a734c.

📒 Files selected for processing (1)
  • frontend/app/view/codeeditor/schemaendpoints.ts (1 hunks)
🧰 Additional context used
🪛 Biome (1.9.4)
frontend/app/view/codeeditor/schemaendpoints.ts

[error] 17-17: Don't use 'Object' as a type.

Prefer explicitly define the object shape. This type means "any non-nullable value", which is slightly better than 'unknown', but it's still a broad type.

(lint/complexity/noBannedTypes)

⏰ Context from checks skipped due to timeout of 90000ms (4)
  • GitHub Check: Analyze (javascript-typescript)
  • GitHub Check: Analyze (go)
  • GitHub Check: merge-gatekeeper
  • GitHub Check: Build for TestDriver.ai
🔇 Additional comments (1)
frontend/app/view/codeeditor/schemaendpoints.ts (1)

21-22: 🛠️ Refactor suggestion

Add validation for schema reference.

The current implementation assumes the schema reference structure is valid. Consider adding validation to handle malformed schemas.

-        const schemaRef: string = fullSchema?.["$ref"];
-        schema = fullSchema?.[schemaRef];
+        const schemaRef: string | undefined = fullSchema?.["$ref"];
+        if (!schemaRef) {
+            throw new Error("Invalid schema: missing $ref");
+        }
+        const resolvedSchema = fullSchema?.[schemaRef];
+        if (!resolvedSchema) {
+            throw new Error(`Invalid schema: cannot resolve reference ${schemaRef}`);
+        }
+        schema = resolvedSchema;

Likely invalid or redundant comment.

allFilepaths.set(`${getWebServerEndpoint()}/schema/settings.json`, [`${getApi().getConfigDir()}/settings.json`]);

async function getSchemaEndpointInfo(endpoint: string): Promise<EndpointInfo> {
let schema: Object;

⚠️ Potential issue

Replace Object type with a more specific type.

As flagged by static analysis, using Object as a type is not recommended.

-    let schema: Object;
+    let schema: JSONSchema;

Comment on lines +13 to +14
const allFilepaths: Map<string, Array<string>> = new Map();
allFilepaths.set(`${getWebServerEndpoint()}/schema/settings.json`, [`${getApi().getConfigDir()}/settings.json`]);

💡 Codebase verification

Focus on Adding Validation for Configuration Paths

The search confirms that the configuration directory (via getApi().getConfigDir()) is used consistently across the codebase (in at least three locations) without any validation. Consider adding checks to ensure the returned paths are valid and handle cases where they might be undefined or malformed.

  • frontend/app/view/codeeditor/schemaendpoints.ts: Mapping the settings file without validation.
  • frontend/app/modals/conntypeahead.tsx: Path for connections.json is directly derived from the config without checks.
  • frontend/app/waveai/waveai.tsx: Path for ai.json is similarly derived without validation.
🔗 Analysis chain

Add validation for configuration paths.

The current implementation assumes the configuration directory and endpoints are always valid. Consider adding validation to handle cases where these paths might be undefined or invalid.
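
A minimal sketch of such a guard; the helper name is hypothetical and the getApi typing is assumed for illustration:

// Typing assumed for illustration; getApi() is the accessor referenced above.
declare function getApi(): { getConfigDir: () => string };

// Validate the config directory before using it to build schema-to-file mappings.
function getValidatedConfigDir(): string {
    const dir = getApi().getConfigDir();
    if (typeof dir !== "string" || dir.length === 0) {
        throw new Error("getConfigDir() returned an empty or invalid path");
    }
    return dir;
}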

🏁 Scripts executed

The following scripts were executed for the analysis:

Script:

#!/bin/bash
# Description: Check if the configuration directory path is consistently used across the codebase

# Search for getConfigDir usage
rg "getConfigDir\(\)" -A 2

Length of output: 916

Comment on lines +18 to +26
    try {
        const data = await fetch(endpoint);
        const fullSchema: object = await data.json();
        const schemaRef: string = fullSchema?.["$ref"];
        schema = fullSchema?.[schemaRef];
    } catch (e) {
        console.log("cannot find schema:", e);
        schema = {};
    }

🛠️ Refactor suggestion

Improve error handling and add timeout for fetch.

The current error handling is too broad and simply logs to console. Consider:

  1. Adding a timeout to the fetch call
  2. Handling specific error types
  3. Propagating errors to the caller
     try {
-        const data = await fetch(endpoint);
+        const controller = new AbortController();
+        const timeoutId = setTimeout(() => controller.abort(), 5000);
+        const data = await fetch(endpoint, { 
+            signal: controller.signal,
+        });
+        clearTimeout(timeoutId);
+        
+        if (!data.ok) {
+            throw new Error(`Failed to fetch schema: ${data.statusText}`);
+        }
+        
         const fullSchema: object = await data.json();
         const schemaRef: string = fullSchema?.["$ref"];
         schema = fullSchema?.[schemaRef];
     } catch (e) {
-        console.log("cannot find schema:", e);
-        schema = {};
+        if (e instanceof Error) {
+            console.error("Schema fetch error:", e.message);
+        }
+        throw new Error("Failed to load schema configuration");
     }

oneirocosm merged commit fc298f2 into main on Feb 3, 2025
9 of 10 checks passed
oneirocosm deleted the sylvie/settings-schema branch on February 3, 2025 at 22:20