Language learning with AI: building an AI-powered Anki plugin

When I learn with Anki using decks prepared by other people, the thing I miss the most is examples. I like to see an example usage of each word, and learning a word in context works best for me. Therefore, assuming I want to learn the word der Waffenstillstand (ceasefire), I would prepare a card with the question:


Die beiden Länder haben endlich einen [ceasefire] vereinbart. (The two countries have finally agreed to a ceasefire.)

and the answer

der Waffenstillstand

Die beiden Länder haben endlich einen Waffenstillstand vereinbart.

It takes a lot of effort to prepare word decks like this. Even if you use ChatGPT to generate the sentences, copy-pasting them takes too long. Because of that, I wanted to automate the process.

I wanted an Anki plugin to retrieve an example sentence from ChatGPT, edit the card, and immediately display an updated card. I don’t need an example for every word, so generating a sentence should be an option I have to trigger manually, preferably using a keyboard shortcut.

The Anatomy of an Anki Plugin

An Anki plugin may consist of a single Python file. Anki will load the file from your plugin directory and run whatever code Anki finds inside. You can split the plugin into multiple Python files or even include dependencies inside the plugin directory. However, if all you need is a single menu item, putting all of the code inside the file may be sufficient. I also replaced the requests library with the built-in http.client so I didn’t need to include any dependencies.

Adding a Menu Option

To add a menu item in Anki, we must create a QAction and add it to the Tools menu. Anki triggers the edit_with_ai function when the user clicks the menu item. The same action gets triggered when the user presses Ctrl+G (Command+G on macOS).

from aqt import mw
from aqt.qt import *


def add_edit_with_ai_button():
    edit_with_ai_button = QAction("Edit with AI", mw)
    # Qt maps Ctrl to Command on macOS automatically
    edit_with_ai_button.setShortcut(QKeySequence("Ctrl+G"))
    edit_with_ai_button.triggered.connect(edit_with_ai)
    mw.form.menuTools.addAction(edit_with_ai_button)


add_edit_with_ai_button()


Preparing an Example with AI

The OpenAI API requires an API key, and we have to store the value somewhere. Putting it directly in the code is a terrible idea, so I stored the key in the ~/.anki/openai file. Therefore, the first part of the script, after the imports, opens the file and reads the API key:

import os

with open(os.path.expanduser("~/.anki/openai")) as f:
    API_KEY = f.read().strip()

Finally, we can call the API. As mentioned earlier, I wanted to use only the built-in modules, so the code is a little bit more verbose than its equivalent written with the requests library would be.

In the function, I ask AI to prepare a JSON object containing the sentences I need. I ask it to create an example sentence at the A2 or B1 level because my goal is to make sure the word I learn is the most difficult part of the example sentence. Also, I use the in-context learning technique called one-shot learning to provide an example. Providing examples in the prompt works better than explaining the desired outcome in detail.

import http.client
import json

def chat_completion(api_key, front_text, back_text):
    message = f"""{front_text}
{back_text}"""

    system_message = """Given a word in English and its translation in German. Return the following things as JSON:

* a sentence (at A2 or B1 German level) in German showing the usage of the word
* the same sentence as in the previous point but with the given word replaced by its English translation and put in square brackets, the rest of the sentence stays in German
* English translation of the sentence


beer coaster
der Bierdeckel

{
  "sentence": "Bitte legen Sie Ihr Bier auf den Bierdeckel.",
  "sentence_with_translation": "Bitte legen Sie Ihr Bier auf den [beer coaster].",
  "sentence_translation": "Please place your beer on the beer coaster."
}"""

    conn = http.client.HTTPSConnection("api.openai.com")
    headers = {
        "Content-Type": "application/json",
        "Authorization": f"Bearer {api_key}"
    }
    payload = {
        "model": "gpt-3.5-turbo",
        "messages": [
            {"role": "system", "content": system_message},
            {"role": "user", "content": message}
        ]
    }
    payload_str = json.dumps(payload)
    conn.request("POST", "/v1/chat/completions", body=payload_str, headers=headers)
    res = conn.getresponse()
    data = res.read()
    result_as_json = json.loads(data.decode("utf-8"))

    result = result_as_json['choices'][0]['message']['content']
    result_dict = json.loads(result)
    sentence = result_dict['sentence']
    sentence_with_translation = result_dict['sentence_with_translation']
    sentence_translation = result_dict['sentence_translation']
    return sentence_with_translation, sentence, sentence_translation
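The parsing logic at the end of the function can be exercised offline with a mocked response body. The payload below is a hand-written stand-in for a Chat Completions response, containing only the fields the plugin reads:

```python
import json

# Hypothetical example of a Chat Completions response body.
# The model's answer arrives as a JSON string inside message.content,
# so we have to call json.loads twice: once for the HTTP body,
# once for the content itself.
raw_response = json.dumps({
    "choices": [
        {
            "message": {
                "content": json.dumps({
                    "sentence": "Die beiden Länder haben endlich einen Waffenstillstand vereinbart.",
                    "sentence_with_translation": "Die beiden Länder haben endlich einen [ceasefire] vereinbart.",
                    "sentence_translation": "The two countries have finally agreed to a ceasefire."
                })
            }
        }
    ]
})

result_as_json = json.loads(raw_response)
result = result_as_json["choices"][0]["message"]["content"]
result_dict = json.loads(result)
print(result_dict["sentence_translation"])
# → The two countries have finally agreed to a ceasefire.
```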

Editing a Card

Finally, we can implement the edit_with_ai function. When the user triggers the action, the plugin requests the example from ChatGPT, edits the card, and updates the UI to display the updated version:

def edit_with_ai():
    current_card = mw.reviewer.card
    if not current_card:
        return  # not in review mode
    note = current_card.note()
    front = note['Front']
    back = note['Back']

    sentence_with_translation, sentence, sentence_translation = chat_completion(API_KEY, front, back)

    front += f"<br><br>{sentence_with_translation} ({sentence_translation})"
    back += f"<br><br>{sentence}"

    note['Front'] = front
    note['Back'] = back
    note.flush()  # persist the changes to the collection
    mw.reset()  # refresh the UI so the updated card is displayed
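The card-editing step itself is plain string concatenation, so it is easy to check in isolation. The field values below are made up for illustration:

```python
# Hypothetical field values, matching the card example from the beginning
# of the article (Front holds the English word, Back the German one).
front = "ceasefire"
back = "der Waffenstillstand"
sentence = "Die beiden Länder haben endlich einen Waffenstillstand vereinbart."
sentence_with_translation = "Die beiden Länder haben endlich einen [ceasefire] vereinbart."
sentence_translation = "The two countries have finally agreed to a ceasefire."

# Append the generated example to both sides of the card,
# separated from the original content by an empty line in HTML.
front += f"<br><br>{sentence_with_translation} ({sentence_translation})"
back += f"<br><br>{sentence}"

print(front)
```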

Installing the Plugin

In local development mode, the plugin's installation is both easy and annoying. We must open the Anki Add-ons window and click the "View Files" button, which opens the add-on directory. To install a new plugin, we copy our plugin files into that directory (every plugin needs a separate subdirectory inside it). After that, we have to restart Anki. As I said, it's easy because all you need to do is copy the files, and it's annoying because you have to restart Anki every time you change anything.

In the case of my plugin, the user installing the plugin must also create the .anki/openai file in their home directory.
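Creating the key file is a one-time shell step. The placeholder key below is, of course, an assumption; use your own OpenAI API key:

```shell
# ~/.anki/openai is the path the plugin reads the key from
mkdir -p "$HOME/.anki"
printf '%s' "sk-your-api-key" > "$HOME/.anki/openai"
chmod 600 "$HOME/.anki/openai"  # keep the key readable only by you
```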

Plugin Source Code

I won’t publish the plugin to the official Anki Add-On repository because I’m not interested in providing ongoing support or adding more languages. If you want to use the plugin, copy the code from this article.

