
v0.7.0

@pavlin-policar released this on 15 Feb

Changes

  • By default, we now add jitter to non-random initialization schemes. This has almost no effect on the resulting visualizations, but helps avoid potential problems when points are initialized at identical positions (#225)
  • By default, the learning rate is now calculated as N/exaggeration. This speeds up convergence of the resulting embedding. Note that the learning rate during the early exaggeration (EE) phase will therefore differ from the learning rate during the standard phase. Additionally, we now set momentum=0.8 in both phases; before, it was 0.5 during EE and 0.8 during the standard phase. This, again, speeds up convergence. See the worked sketch after this list. (#220)
  • Add PrecomputedAffinities to wrap square affinity matrices; see the usage sketch after this list (#217)
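
To make the new learning-rate default concrete, here is a minimal worked sketch of the N/exaggeration rule. The arithmetic mirrors the description above (assuming the default early_exaggeration of 12), and passing learning_rate explicitly is shown only to illustrate that a user-supplied value is assumed to override the automatic one; check #220 for the exact behavior.

```python
import numpy as np
from openTSNE import TSNE

x = np.random.RandomState(42).normal(size=(10_000, 50))

# New default: learning_rate = N / exaggeration. With the default
# early_exaggeration of 12, the EE phase uses 10_000 / 12 ≈ 833, while the
# standard phase (no exaggeration, i.e. exaggeration = 1) uses 10_000 / 1.
n_samples = x.shape[0]
lr_early = n_samples / 12    # learning rate during the EE phase
lr_standard = n_samples / 1  # learning rate during the standard phase

# Supplying learning_rate explicitly (assumed to override the automatic value).
embedding = TSNE(learning_rate=lr_standard, n_jobs=8).fit(x)
```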
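Below is a hedged sketch of how the new PrecomputedAffinities wrapper might plug into the advanced TSNEEmbedding API. The openTSNE.affinity module path and the constructor accepting a dense square matrix are assumptions based on the description in #217, not a verified signature.

```python
import numpy as np
import openTSNE
from openTSNE.affinity import PrecomputedAffinities  # module path assumed

# A square, symmetric affinity matrix computed elsewhere (e.g. with a custom
# kernel). Here we fabricate a small random one purely for illustration.
rng = np.random.RandomState(0)
p = rng.uniform(size=(500, 500))
p = (p + p.T) / 2       # symmetrize
np.fill_diagonal(p, 0)  # no self-affinities
p /= p.sum()            # normalize like a joint probability matrix

affinities = PrecomputedAffinities(p)  # assumed to accept a dense square matrix

# Plug the wrapped affinities into the advanced (TSNEEmbedding) API.
init = rng.normal(0, 1e-4, size=(p.shape[0], 2))
embedding = openTSNE.TSNEEmbedding(init, affinities, negative_gradient_method="fft")
embedding.optimize(n_iter=250, exaggeration=12, momentum=0.8, inplace=True)
embedding.optimize(n_iter=500, momentum=0.8, inplace=True)
```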

Build changes

  • Build universal2 macos wheels enabling ARM support (#226)

Bug Fixes

  • Fix Barnes-Hut (BH) embedding collapse for smaller data sets (#235)
  • Fix updates in the optimizer not being stored correctly between optimization calls (#229)
  • Fix inplace=True optimization changing the initializations themselves in some rare use cases (#225)

As usual, a special thanks to @dkobak for helping with practically all of these changes and fixes.