[MINOR][DOCS] Fix typos in docs/sql-ref-number-pattern.md
### What changes were proposed in this pull request?

Fix typos in docs/sql-ref-number-pattern.md

### Why are the changes needed?

Fix typos.

### Does this PR introduce any user-facing change?

No

### How was this patch tested?

Pass the CIs.

Closes apache#47557 from tomscut/doc-typos.

Authored-by: tom03.li <[email protected]>
Signed-off-by: Kent Yao <[email protected]>
tom03.li authored and yaooqinn committed Aug 2, 2024
1 parent aca0d24 commit 6e66be7
Showing 3 changed files with 3 additions and 3 deletions.
2 changes: 1 addition & 1 deletion docs/spark-standalone.md
@@ -820,7 +820,7 @@ In order to enable this recovery mode, you can set SPARK_DAEMON_JAVA_OPTS in spa
<td><code>spark.deploy.recoveryDirectory</code></td>
<td>""</td>
<td>The directory in which Spark will store recovery state, accessible from the Master's perspective.
-Note that the directory should be clearly manualy if <code>spark.deploy.recoveryMode</code>
+Note that the directory should be cleared manually if <code>spark.deploy.recoveryMode</code>
or <code>spark.deploy.recoveryCompressionCodec</code> is changed.
</td>
<td>0.8.1</td>
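The property touched in this hunk is normally supplied through `SPARK_DAEMON_JAVA_OPTS`, as the hunk header above notes. A minimal sketch for `spark-env.sh`, assuming `FILESYSTEM` recovery mode; the directory path is illustrative, not from the commit:

```shell
# Sketch of spark-env.sh settings for standalone Master recovery.
# /var/lib/spark/recovery is an illustrative path, not a Spark default.
export SPARK_DAEMON_JAVA_OPTS="-Dspark.deploy.recoveryMode=FILESYSTEM \
-Dspark.deploy.recoveryDirectory=/var/lib/spark/recovery"
```

Per the doc text above, that directory should be cleared manually whenever `spark.deploy.recoveryMode` or `spark.deploy.recoveryCompressionCodec` changes.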
2 changes: 1 addition & 1 deletion docs/sparkr-migration-guide.md
@@ -28,7 +28,7 @@ Please refer [Migration Guide: SQL, Datasets and DataFrame](sql-migration-guide.

## Upgrading from SparkR 3.1 to 3.2

-- Previously, SparkR automatically downloaded and installed the Spark distribution in user' cache directory to complete SparkR installation when SparkR runs in a plain R shell or Rscript, and the Spark distribution cannot be found. Now, it asks if users want to download and install or not. To restore the previous behavior, set `SPARKR_ASK_INSTALLATION` environment variable to `FALSE`.
+- Previously, SparkR automatically downloaded and installed the Spark distribution in user's cache directory to complete SparkR installation when SparkR runs in a plain R shell or Rscript, and the Spark distribution cannot be found. Now, it asks if users want to download and install or not. To restore the previous behavior, set `SPARKR_ASK_INSTALLATION` environment variable to `FALSE`.
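The opt-out described in this hunk is a single environment variable. A minimal sketch, assuming a SparkR 3.2+ session launched from a plain R shell or Rscript:

```shell
# Restore the pre-3.2 behavior: download and install the Spark
# distribution without prompting, using the variable named above.
export SPARKR_ASK_INSTALLATION=FALSE
```

With the variable exported, a subsequent `Rscript -e 'library(SparkR); sparkR.session()'` would proceed without the interactive prompt.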

## Upgrading from SparkR 2.4 to 3.0

2 changes: 1 addition & 1 deletion docs/sql-ref-number-pattern.md
@@ -137,7 +137,7 @@ Note that the format string used in most of these examples expects:
> SELECT to_number('1234-', '999999MI');
-1234

--- PR indicates optional wrapping angel brakets.
+-- PR indicates optional wrapping angle brackets.
> SELECT to_number('9', '999PR')
9
```
