Evo_interface.zip
As you may know, the Chessnut Evo is essentially an Android tablet with a chessboard attached to it. Android apps installed on it can detect the position of the pieces on the board and turn on LEDs to show where the player should move the opponent’s pieces. The company behind it has sent me their official toolkit (attached), which shows how the interface works.
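I won’t reproduce the toolkit here, but conceptually the interface boils down to two things: a callback that fires when the sensors report a new piece arrangement, and a call to light LEDs on specific squares. A minimal sketch in Kotlin, with all names invented for illustration (they are not the actual classes from the Chessnut toolkit):

```kotlin
// Hypothetical sketch only: these names are invented for illustration
// and are NOT the actual API from the Chessnut toolkit.
interface EvoBoard {
    // Fires whenever the sensors report a new piece arrangement,
    // e.g. as a FEN piece-placement string like
    // "rnbqkbnr/pppppppp/8/8/8/8/PPPPPPPP/RNBQKBNR".
    fun setOnPositionChangedListener(listener: (placement: String) -> Unit)

    // Lights the LEDs on the given squares ("g8", "f6", ...) so the
    // player knows where to move the opponent's piece.
    fun setLeds(squares: List<String>)
}
```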
Even though it would probably be quite easy to do, I’m under no illusion that the official Lichess app will ever support these features on the Chessnut Evo directly (but I’d love to be proven wrong 😁).
So, what would be the easiest way for me to add this functionality myself? Would it be best to maintain my own fork of the app and install that on the board, or could I have some kind of add-on app installed on the board that feeds the input from the board into the official app?
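For the add-on route, one option that sidesteps the on-device app entirely is to talk to Lichess directly: the public Board API (https://lichess.org/api#tag/Board) accepts moves in UCI notation over plain HTTPS, so a small background app could relay moves from the board sensors to the server. A minimal sketch using OkHttp; the game ID and personal access token would come from the Board API’s game-stream and OAuth flow, which I’ve left out:

```kotlin
import okhttp3.OkHttpClient
import okhttp3.Request
import okhttp3.RequestBody.Companion.toRequestBody

// Minimal relay: pushes a move picked up from the board sensors to an
// ongoing Lichess game via the public Board API.
class LichessRelay(private val token: String) {
    private val client = OkHttpClient()

    // Sends a move in UCI notation (e.g. "e2e4") to the given game.
    // Endpoint: POST https://lichess.org/api/board/game/{gameId}/move/{move}
    fun sendMove(gameId: String, uciMove: String): Boolean {
        val request = Request.Builder()
            .url("https://lichess.org/api/board/game/$gameId/move/$uciMove")
            .header("Authorization", "Bearer $token")
            .post(ByteArray(0).toRequestBody()) // empty body; the move is in the URL
            .build()
        client.newCall(request).execute().use { response ->
            return response.isSuccessful
        }
    }
}
```

The same API streams the opponent’s moves back as NDJSON (GET /api/board/game/stream/{gameId}), which is what you’d feed into the LED calls sketched above. The obvious downside is that the relay replaces the official app as the Lichess client rather than feeding input into it, so it wouldn’t drive the app’s own UI.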
Any other ideas and hints on how to do this would be greatly appreciated! 😊
EDIT: I might add that the Chessnut Evo has a built-in feature called “Chessnut Vision” which uses image recognition to detect chessboards on the screen. Once it finds one, it activates the appropriate LEDs on the board and translates your physical moves into moves on the on-screen chessboard. This way, the board does have some kind of ad-hoc support for the official Lichess app. But as you may have guessed, it’s both very slow and error-prone.