
There is a bug in rstar 0.12.1, so pin for now. #1261

Closed
wants to merge 2 commits

Conversation

@michaelkirk (Member) commented Nov 5, 2024

  • I agree to follow the project's code of conduct.
  • I added an entry to CHANGES.md if knowledge of this change could be valuable to users.

FIXES #1260

see: georust/rstar#181

We don't have to do this - we could wait a bit if we think an rstar fix is forthcoming.
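For concreteness, a pin might look something like this in Cargo.toml (a sketch only; the exact requirement in this PR's commits may differ):

    [dependencies]
    # Hold rstar below the broken 0.12.1 release until a fix ships.
    rstar = ">=0.12.0, <0.12.1"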

@urschrei (Member) commented Nov 5, 2024

Let's hold off until Adam has a look at georust/rstar#181. #1254 is open to enable #1246, but #1246 depends on a new API in rstar 0.12.1, so it would be blocked by this anyway.

@michaelkirk (Member, Author)

I'm ok to wait a little bit, but my primary concern isn't the new feature in #1246; it's geo users getting incorrect results in the meantime.

@michaelkirk (Member, Author)

It looks like resolution is imminent with georust/rstar#184

Anyone opposed to my following up with patch releases to geo-types and geo to require rstar = "0.12.2"?
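In Cargo.toml terms that would be (a caret requirement under Cargo's default semantics, so it resolves to 0.12.2 or any later 0.12.x):

    [dependencies]
    # "0.12.2" means >=0.12.2, <0.13.0 under caret semantics,
    # which skips the buggy 0.12.1 entirely.
    rstar = "0.12.2"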

@urschrei (Member) commented Nov 5, 2024

Do we need to do that if we yank rstar 0.12.1?

@urschrei (Member) commented Nov 5, 2024

0.12.1 is yanked

@michaelkirk (Member, Author)

I'm not sure!

What happens if you run cargo update -p geo in your project? Is it possible to update geo while keeping the yanked version of rstar around?

@michaelkirk (Member, Author) commented Nov 5, 2024

Answering my own question: cargo update -p geo will not proactively update its rstar dependency, even if the current version has been yanked.

Unless we make a new release, the user would need to either run (unqualified) cargo update or know to run cargo update -p rstar.

Proof:

$ cargo update -p geo
    Updating crates.io index
     Locking 0 packages to latest compatible versions
note: pass `--verbose` to see 5 unchanged dependencies behind latest

$ cargo tree | rg rstar
    │   └── rstar v0.12.0
    ├── rstar v0.12.0 (*)

$ cargo update -p rstar
    Updating crates.io index
     Locking 1 package to latest compatible version
    Updating rstar v0.12.0 -> v0.12.2
note: pass `--verbose` to see 4 unchanged dependencies behind latest

$ cargo tree | rg rstar
  Downloaded rstar v0.12.2
  Downloaded 1 crate (43.8 KB) in 0.19s
    │   └── rstar v0.12.2
    ├── rstar v0.12.2 (*)

@urschrei (Member) commented Nov 5, 2024

Annoying. Thanks for figuring that out!

@urschrei (Member) commented Nov 6, 2024

Where did we land on this after yesterday's discussion? My understanding is that there was a 24 hour (ish) period during which, if you updated your geo dependency, or started a new project and included it, you'd end up with a perf-regressed and buggy geo, which WOULD be fixed with a full cargo update but WOULDN'T be fixed with a cargo update -p geo.

@lnicola (Member) commented Nov 6, 2024

I think Cargo gives a warning if you have a yanked crate in your lockfile. Should be relatively easy to test.
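A sketch of one way to check (assuming a recent Cargo, where update --precise can select a yanked version; whether and where the yank warning appears varies by Cargo version):

$ cargo new yank-check && cd yank-check
$ cargo add rstar@0.12.0
$ cargo update -p rstar --precise 0.12.1   # force the yanked release into Cargo.lock
$ cargo build                              # see whether Cargo warns about the yanked crate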

@michaelkirk closed this Nov 6, 2024
Successfully merging this pull request may close these issues: CI failing after cargo update