
deepL-API Extension: Add, edit and delete Glossary Entries #16

Open
oelgoetz opened this issue Mar 31, 2024 · 2 comments

Comments

@oelgoetz

I want to use the DeepL API in a C# project to translate from one source language into three different target languages, working with a single glossary for each language pair.

However, a glossary is not static; from time to time it needs to be updated, e.g. by adding a new word pair or by replacing or deleting an existing one.

So far I have only found ways to create or delete a complete glossary, or to export its contents.
Using those, I could export the whole glossary, apply the desired changes to the exported list, and replace the original glossary with the new one. For a large glossary that seems like a ridiculously big effort (and an asynchronous task), and I would end up with a new glossary ID every time.

So please add AddGlossaryEntry(), EditGlossaryEntry(), and DeleteGlossaryEntry() methods.

@JanEbbing
Member

Hi @oelgoetz, could you describe your use case in a bit more detail? Why is it a huge asynchronous effort to apply the change to the whole glossary? Do you get the glossary change requests from someone else?

@oelgoetz
Author

oelgoetz commented Apr 2, 2024

Hi Jan,

thanks for asking. It's simply a matter of efficiency. I have already implemented the routine: download the entire glossary, add, modify or delete a single entry, then delete the glossary on the server and upload a new one (and save the new ID for future reference). Of course it can be done that way, but the whole process doesn't seem very efficient to me.

Maybe you know a better way ...

My DeepL glossary contains fewer than 100 entries at the moment, but I already know for sure that there will eventually be a few thousand; I currently keep them in an older translation-memory system.
Downloading n*1000 pairs for one change and then uploading (n-1)*1000 + n*1000 pairs (with thousands of redundancies) again just seems dodgy to me.
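For illustration, the client-side half of the workaround described above can be sketched in Python. The `apply_glossary_edits` helper below is hypothetical (it is not part of any DeepL client library); it only computes the new entry set in memory, which is the easy part of the round trip:

```python
# Sketch of the current workaround: because the API only supports creating
# and deleting whole glossaries, a single-entry change means rebuilding the
# full source->target entry set client-side and re-uploading it.
# apply_glossary_edits is a hypothetical helper, not a DeepL API method.

def apply_glossary_edits(entries, add=None, edit=None, delete=None):
    """Return a new source->target dict with the requested changes applied."""
    updated = dict(entries)
    for source, target in (add or {}).items():
        if source in updated:
            raise KeyError(f"entry already exists: {source}")
        updated[source] = target
    for source, target in (edit or {}).items():
        if source not in updated:
            raise KeyError(f"no such entry: {source}")
        updated[source] = target
    for source in (delete or []):
        updated.pop(source, None)
    return updated

glossary = {"Hund": "dog", "Katze": "cat"}
new_entries = apply_glossary_edits(
    glossary,
    add={"Pferd": "horse"},
    edit={"Katze": "cat (animal)"},
    delete=["Hund"],
)
print(new_entries)  # {'Katze': 'cat (animal)', 'Pferd': 'horse'}
```

After computing `new_entries`, the caller would still have to delete the old glossary on the server and create a new one from the full entry set (receiving a new glossary ID), which is exactly the inefficiency this issue is about: the entire glossary travels over the wire for a one-entry change.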

@JanEbbing JanEbbing transferred this issue from DeepLcom/deepl-dotnet Apr 3, 2024