EmEditor v23.1.0 released (including technical review)

Today, we are releasing EmEditor v23.1.0.

In the previous version (v23.0), we showed how macros and the built-in Web Browser in EmEditor can be used to access generative AI web pages and obtain various information and services. However, customers who have a paid API key for a generative AI site can get faster, higher-quality, and more stable service by calling the API directly. This is done with the JavaScript fetch function, but because fetch operates asynchronously, its return value may not be available before the macro ends. Even in v23.0, this method could be used as long as the built-in Web Browser in EmEditor was displayed, but async functions could not be used when the Web Browser was not displayed.

In the new version (v23.1), the KeepRunning property makes it possible to wait for an async function to complete without ending the macro (EmEditor Professional only). Before calling the async function, set the KeepRunning property as follows:


shell.KeepRunning = true;

This keeps the macro running, allowing you to obtain the return value of the async function. To end the macro after the return value has been obtained, set the property back as follows:


shell.KeepRunning = false;

While the macro is waiting for async functions to complete, this is almost equivalent to Quit(): the macro ends immediately.
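
As a minimal sketch of this pattern (the doSomethingAsync function below is only a placeholder for any awaitable work, such as a fetch request, and is not part of the EmEditor macro API), a macro might look like this:


#language="v8"

// Placeholder for any awaitable work, such as a fetch request.
async function doSomethingAsync() {
    return "done";
}

async function run() {
    shell.KeepRunning = true;                 // keep the macro alive after the last statement
    const result = await doSomethingAsync();  // the macro waits here instead of ending
    OutputBar.writeln( result );              // display the awaited value in the output bar
    OutputBar.Visible = true;
    shell.KeepRunning = false;                // allow the macro to end
}
run();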

Below is a sample macro that uses the fetch function to call the OpenAI API. To run this sample macro, replace {your-API-key} with your own API key. When you run the macro, it sends the question “How are you?” to the OpenAI API and displays the answer in the output bar.


#language="v8"

const apiKey = "{your-API-key}";

/**
 * Sends prompt to OpenAI and returns the response.
 * Docs: https://platform.openai.com/docs/guides/text-generation/chat-completions-api?lang=curl
 * @param {string} endpoint URL for request
 * @param {string} apiKey API key
 * @param {string} messageContent The prompt
 * @returns {string} The text content of the response
 */
async function callOpenAI(endpoint, apiKey, messageContent) {
    const response = await fetch(
        endpoint,
        {
            method: "POST",
            headers: {
                "Authorization": `Bearer ${apiKey}`,
                "content-type": "application/json",
            },
            body: JSON.stringify({
                "model": "gpt-3.5-turbo",
                "messages": [
                    {
                        "role": "user",
                        "content": messageContent,
                    }
                ],
            }),
        }
    );
    if (!response.ok) {
        alert(await response.text());
        Quit();
    }

    const responseObj = await response.json();
    if (responseObj.choices.length == 0) {
        alert("choices length is 0");
        Quit();
    }

    // Get content of first choice
    return responseObj.choices[0].message.content;
}

async function main() {
    const endpoint = "https://api.openai.com/v1/chat/completions";
    const sPrompt = "How are you?";
    shell.KeepRunning = true;
    const response = await callOpenAI(endpoint, apiKey, sPrompt);
    OutputBar.writeln( response );
    OutputBar.Visible = true;
    shell.KeepRunning = false;
}
main();
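
Note that if fetch throws (for example, due to a network failure), main() never reaches the line shell.KeepRunning = false;, and the macro keeps running. A more defensive variant of main(), shown here only as a sketch, wraps the call in try/catch/finally:


async function main() {
    const endpoint = "https://api.openai.com/v1/chat/completions";
    const sPrompt = "How are you?";
    shell.KeepRunning = true;
    try {
        const response = await callOpenAI(endpoint, apiKey, sPrompt);
        OutputBar.writeln( response );
        OutputBar.Visible = true;
    }
    catch (e) {
        // Report the error (for example, a network failure) instead of failing silently.
        alert( e.message );
    }
    finally {
        // Always reached, so the macro can end even when an error occurs.
        shell.KeepRunning = false;
    }
}
main();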

The ChatOpenAI.jsee macro example extends this sample further by displaying a popup menu of commonly used prompts (“Proofread”, “Summarize”, “Look up”, “Translate”…) for the currently selected text or the entire document. When an item is selected, the macro sends the corresponding question to the OpenAI API and displays the answer in the output bar.
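
As a rough sketch of the idea (building on the callOpenAI function and apiKey from the sample above; the prompt text here is just an example, and the actual ChatOpenAI.jsee macro may be implemented differently), the selected text can be turned into a prompt like this:


// Assumes the callOpenAI() function and apiKey constant from the sample macro above.
async function proofreadSelection() {
    const endpoint = "https://api.openai.com/v1/chat/completions";
    // document.selection.Text returns the currently selected text in the active document.
    const sSelectedText = document.selection.Text;
    const sPrompt = "Proofread the following text:\n\n" + sSelectedText;
    shell.KeepRunning = true;
    const response = await callOpenAI(endpoint, apiKey, sPrompt);
    OutputBar.writeln( response );
    OutputBar.Visible = true;
    shell.KeepRunning = false;
}
proofreadSelection();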

In fact, I used the ChatOpenAI.jsee macro to translate and proofread this blog post while writing it. By combining generative AI with a text editor, I have been able to significantly improve my work efficiency.

Another major change in v23.1 is the speed improvement when handling large files. In v23.0, changes to lines were stored in memory instead of temporary files for faster operation. However, on systems with limited memory, this could cause slowness or errors due to insufficient system memory. In v23.1, the memory-related algorithms have been revised to operate more efficiently. Additionally, when virtual memory becomes insufficient, EmEditor now uses temporary files to store data. As a result, users no longer need to worry about the size of virtual memory, and crashes due to memory shortage are far less frequent. Thanks to the improved memory-related code, together with multi-threading and the SIMD instruction set, many commands are now 1.51 to 41.2 times faster than in v23.0 when editing huge files, including CSV files.

The Help feature now defaults to using an external browser instead of EmEditor’s built-in Web Browser, as in v22.5 and earlier versions. Furthermore, a Help page has been added to the Customize dialog box, allowing users to change settings related to Help.

Lastly, Makoto Emura added the Completion List feature using the Language Server Protocol (LSP). To use this feature, the Language Server Protocol must be enabled on the Language Server page of configuration properties, and the Show completion list option must also be enabled (EmEditor Professional only). Currently, only JavaScript supports this feature.

I hope you like EmEditor, whether you use the Professional or Free version. Please contact us or write in the forums if you have any questions, feature requests, or ideas.

Thank you for using EmEditor!
Yutaka Emura

Please see EmEditor v23.1 New Features for details and screenshots.

This release also includes all bug fixes made while developing v23.1.

If you use the Desktop Installer version, you can select Check for Updates on the Help menu to download the newest version. If this method fails, please download the newest version and run the downloaded installer. If you use the Desktop portable version, you can go to the Download page to download the newest version. The Store App versions can be updated through the Microsoft Store (64-bit or 32-bit) after a few days.