
Commit

Merge branch 'master' into add-all_touched-parameter
# Conflicts:
#	common/src/test/java/org/apache/sedona/common/raster/RasterConstructorsTest.java
prantogg committed Feb 10, 2025
2 parents ba71f5d + 0127688 commit 4f08c9c
Showing 149 changed files with 17,220 additions and 689 deletions.
3 changes: 3 additions & 0 deletions .github/workflows/python-wheel.yml
@@ -54,6 +54,9 @@ jobs:
if: runner.os == 'Linux'
uses: docker/setup-qemu-action@v3
with:
# temporarily pin to qemu@v8 to workaround non-deterministic gcc segfaults
# https://github.com/docker/setup-qemu-action/issues/188
image: tonistiigi/binfmt:qemu-v8.1.5
platforms: all
- name: Build wheels
uses: pypa/cibuildwheel@…
8 changes: 2 additions & 6 deletions .github/workflows/python.yml
@@ -163,13 +163,9 @@ jobs:
- name: Run Spark Connect tests
env:
PYTHON_VERSION: ${{ matrix.python }}
SPARK_VERSION: ${{ matrix.spark }}
if: ${{ matrix.spark >= '3.4.0' }}
run: |
if [ ! -f "${VENV_PATH}/lib/python${PYTHON_VERSION}/site-packages/pyspark/sbin/start-connect-server.sh" ]
then
echo "Skipping connect tests for Spark $SPARK_VERSION"
exit
fi
export SPARK_HOME=${VENV_PATH}/lib/python${PYTHON_VERSION}/site-packages/pyspark
export SPARK_REMOTE=local
34 changes: 24 additions & 10 deletions .pre-commit-config.yaml
@@ -26,12 +26,14 @@ repos:
- repo: meta
hooks:
- id: identity
name: run identity check
- id: check-hooks-apply
name: run check hooks apply
- repo: https://github.com/Lucas-C/pre-commit-hooks
rev: v1.5.5
hooks:
- id: insert-license
name: Add license for all Markdown files
name: add license for all Markdown files
files: \.md$
args:
- --comment-style
@@ -41,7 +43,7 @@ repos:
- --fuzzy-match-generates-todo
exclude: ^docs/index\.md$|^\.github/pull_request_template\.md$|\.github/issue_template\.md$
- id: insert-license
name: Add license for all Makefile files
name: add license for all Makefile files
files: ^Makefile$
args:
- --comment-style
@@ -50,7 +52,7 @@ repos:
- .github/workflows/license-templates/LICENSE.txt
- --fuzzy-match-generates-todo
- id: insert-license
name: Add license for all TOML files
name: add license for all TOML files
files: \.toml$
args:
- --comment-style
@@ -59,7 +61,7 @@ repos:
- .github/workflows/license-templates/LICENSE.txt
- --fuzzy-match-generates-todo
- id: insert-license
name: Add license for all YAML files
name: add license for all YAML files
files: \.ya?ml$
args:
- --comment-style
@@ -71,33 +73,43 @@ repos:
rev: 24.10.0
hooks:
- id: black-jupyter
name: run black-jupyter
description: format Python files and Jupyter Notebooks with black
- repo: https://github.com/pre-commit/mirrors-clang-format
rev: v19.1.4
hooks:
- id: clang-format
name: run clang-format
description: format C files with clang-format
args: [--style=Google]
types_or: [c]
- repo: https://github.com/PyCQA/bandit
rev: 1.7.10
hooks:
- id: bandit
name: run bandit
description: check Python code for security issues
args: ["-c=pyproject.toml", "-r"]
- repo: https://github.com/codespell-project/codespell
rev: v2.3.0
hooks:
- id: codespell
name: Run codespell
description: Check spelling with codespell
name: run codespell
description: check spelling with codespell
args: [--ignore-words=.github/linters/codespell.txt]
exclude: ^docs/image|^spark/common/src/test/resources|^docs/usecases|^tools/maven/scalafmt
- repo: https://github.com/gitleaks/gitleaks
rev: v8.21.2
hooks:
- id: gitleaks
name: run gitleaks
description: check for secrets with gitleaks
- repo: https://github.com/shssoichiro/oxipng
rev: v9.1.2
hooks:
- id: oxipng
name: run oxipng
description: check PNG files with oxipng
args: ["-o", "4", "--strip", "safe", "--alpha"]
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v5.0.0
@@ -151,8 +163,8 @@ repos:
rev: v0.43.0
hooks:
- id: markdownlint
name: Run markdownlint
description: Check Markdown files with markdownlint
name: run markdownlint
description: check Markdown files with markdownlint
args: [--config=.github/linters/.markdown-lint.yml]
exclude: ^\.github/.*$
types: [markdown]
@@ -161,12 +173,14 @@ repos:
rev: v0.10.0.1
hooks:
- id: shellcheck
name: run shellcheck
description: check Shell scripts with shellcheck
- repo: https://github.com/adrienverge/yamllint
rev: v1.35.1
hooks:
- id: yamllint
name: Run yamllint
description: Check YAML files with yamllint
name: run yamllint
description: check YAML files with yamllint
args: [--strict, -c=.github/linters/.yaml-lint.yml]
types: [yaml]
files: \.ya?ml$
44 changes: 44 additions & 0 deletions CONTRIBUTING.md
@@ -0,0 +1,44 @@
<!--
Licensed to the Apache Software Foundation (ASF) under one
or more contributor license agreements. See the NOTICE file
distributed with this work for additional information
regarding copyright ownership. The ASF licenses this file
to you under the Apache License, Version 2.0 (the
"License"); you may not use this file except in compliance
with the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed to in writing,
software distributed under the License is distributed on an
"AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
KIND, either express or implied. See the License for the
specific language governing permissions and limitations
under the License.
-->

# How to contribute to Apache Sedona

Welcome! We'd love to have you contribute to Apache Sedona!

## Did you find a bug?

Create an issue with a reproducible example. Please specify the Sedona version, Java version, code snippet, and error message.
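
As an illustration only (this snippet is not part of the committed CONTRIBUTING.md), a minimal reproducible example from an R user might look like the sketch below; the connection settings and query are placeholders for whatever actually triggers the bug:

```r
# Hypothetical bug-report sketch: state your Sedona, Spark, and Java versions,
# then include the smallest snippet that reproduces the error.
library(sparklyr)
library(apache.sedona)

sc <- spark_connect(master = "local")

# Smallest query that triggers the problem
tbl <- sdf_sql(sc, "SELECT ST_Point(1.0, 2.0) AS geom")
sdf_collect(tbl)  # paste the full error message this call produces

spark_disconnect(sc)
```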

## Did you create a PR to fix a bug?

See [here](https://sedona.apache.org/latest/community/rule/#make-a-pull-request) for instructions on how to open PRs.

We appreciate bug fixes - thank you in advance!

## Would you like to add a new feature or change existing code?

If you would like to add a feature or change existing behavior, please make sure to create an issue/JIRA ticket and get the planned work approved by the core team first!

It's always better to get aligned with the core devs before writing any code.

## Do you have questions about the source code?

Feel free to create an issue or join the [Discord](https://share.hsforms.com/1Ndql_ZigTdmLlVQc_d1o4gqga4q) with questions!

Thanks for reading and looking forward to collaborating with you!
4 changes: 2 additions & 2 deletions R/R/data_interface.R
@@ -435,7 +435,7 @@ spark_read_shapefile <- function(sc,

lapply(names(options), function(name) {
if (!name %in% c("")) {
warning(paste0("Ignoring unknown option '", name,"'"))
warning(paste0("Ignoring unknown option '", name, "'"))
}
})

@@ -470,7 +470,7 @@ spark_read_geojson <- function(sc,
if ("skip_syntactically_invalid_geometries" %in% names(options)) final_skip <- options[["skip_syntactically_invalid_geometries"]] else final_skip <- TRUE
lapply(names(options), function(name) {
if (!name %in% c("allow_invalid_geometries", "skip_syntactically_invalid_geometries")) {
warning(paste0("Ignoring unknown option '", name,"'"))
warning(paste0("Ignoring unknown option '", name, "'"))
}
})

6 changes: 3 additions & 3 deletions R/tests/testthat/test-data-interface-raster.R
@@ -33,7 +33,7 @@ test_that("Passed RS_FromGeoTiff from binary", {
mutate(raster = RS_FromGeoTiff(content))

expect_equal(
raster_sdf %>% sdf_schema() ,
raster_sdf %>% sdf_schema(),
list(
path = list(name = "path", type = "StringType"),
modificationTime = list(name = "modificationTime", type = "TimestampType"),
@@ -65,7 +65,7 @@ test_that("Passed RS_FromArcInfoAsciiGrid from binary", {
mutate(raster = RS_FromArcInfoAsciiGrid(content))

expect_equal(
raster_sdf %>% sdf_schema() ,
raster_sdf %>% sdf_schema(),
list(
path = list(name = "path", type = "StringType"),
modificationTime = list(name = "modificationTime", type = "TimestampType"),
@@ -101,7 +101,7 @@ test_that("Passed RS_Envelope with raster", {
)

expect_equal(
raster_sdf %>% sdf_schema() ,
raster_sdf %>% sdf_schema(),
list(
path = list(name = "path", type = "StringType"),
modificationTime = list(name = "modificationTime", type = "TimestampType"),
2 changes: 1 addition & 1 deletion R/vignettes/articles/apache-sedona.Rmd
@@ -447,7 +447,7 @@ Or change at runtime:
```{r}
spark_session(sc) %>%
invoke("conf") %>%
invoke("set", "sedona.global.index","false")
invoke("set", "sedona.global.index", "false")
invoke_new(sc, "org.apache.sedona.core.utils.SedonaConf", invoke(spark_session(sc), "conf"))
```
2 changes: 1 addition & 1 deletion R/vignettes/articles/raster.Rmd
@@ -107,7 +107,7 @@ dir(dest_file, recursive = TRUE)
Available options see [Raster writer](../../../api/sql/Raster-writer/):

* rasterField: the binary column to be saved (if there is only one takes that column by default, otherwise specify)
* fileExtension: `.tiff` bvy default, also accepts `.png`, `.jpeg`, `.asc`
* fileExtension: `.tiff` by default, also accepts `.png`, `.jpeg`, `.asc`
* pathField: if used any column name that indicates the paths of each raster file, otherwise random UUIDs are generated.
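
As a hedged sketch (not part of the vignette shown above), these options might be passed roughly as follows, assuming the apache.sedona `spark_write_raster()` writer with a sparklyr-style `options` list and a Spark DataFrame `raster_sdf` holding a binary `raster` column as in the vignette:

```r
# Illustrative only: write each raster as PNG, naming files from the `path`
# column; the column names and destination directory are assumptions.
library(sparklyr)
library(apache.sedona)

dest_dir <- tempfile("rasters_")   # hypothetical output directory

raster_sdf %>%
  spark_write_raster(
    path = dest_dir,
    options = list(
      rasterField   = "raster",   # binary column to save
      fileExtension = ".png",     # .tiff is the default
      pathField     = "path"      # omit to get random UUID file names
    )
  )
```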
