Smart CodeInsight
Smart CodeInsight is an open architecture that lets you work with your favorite AI engines. In addition to customizable UI integration, RAD Studio implements the following features:
- A core AI engine and ToolsAPI.
- IDE integration:
  - Support for online and offline LLM providers such as:
    - OpenAI, Gemini, and Claude (online).
    - Ollama (offline).
Tip: Ollama is distributed as open source under an MIT license. Individual models that you might use with Ollama have their own licenses. You can find the Codellama Acceptable Use Policy on this page.
Note: For ToolsAPI enthusiasts, RAD Studio gives you the opportunity to create your own AI plugin. For more information, refer to the Create and implement your AI plugin page.
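As a rough sketch only, the skeleton below shows the general shape such a plugin unit might take. The unit name, the class TMyAIEngine, and the method names are placeholders invented for illustration; the actual ToolsAPI interfaces and the registration steps are documented on the Create and implement your AI plugin page.

```delphi
unit MyAIEnginePlugin;

// Sketch of a custom Smart CodeInsight engine plugin.
// NOTE: TMyAIEngine and its methods are hypothetical placeholders;
// consult the "Create and implement your AI plugin" page for the real
// ToolsAPI types and for how to register the plugin with the IDE.

interface

uses
  System.SysUtils;

type
  // Stand-in for the real ToolsAPI plugin type.
  TMyAIEngine = class
  public
    // Name shown in the IDE for this engine.
    function EngineName: string;
    // Send a prompt to your own LLM back end and return its answer.
    function AskQuestion(const APrompt: string): string;
  end;

implementation

function TMyAIEngine.EngineName: string;
begin
  Result := 'My Custom LLM';
end;

function TMyAIEngine.AskQuestion(const APrompt: string): string;
begin
  // Call your LLM provider here (for example, over its REST API)
  // and return the generated text to the IDE.
  Result := Format('Echo of the prompt: %s', [APrompt]);
end;

end.
```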
Configuration Options
RAD Studio gives you extensive configuration options and full privacy control, allowing you to:
- Turn off the entire AI feature with a single global setting.
- Enable or disable each of the four engines.
- Pick the default engine for the UI elements (chat and editor menu).
You can configure all these features in the IDE Options. For more information, see the Smart CodeInsight Configuration page.
We have also implemented the following measures to ensure security:
- We store the API keys in an encrypted format.
- We include the option to use a local, offline engine.
Editor Menu Commands
The editor menu offers preset operations on the code selected in the editor. The goal is to analyze and optimize a portion of your application's source code.
The LLM engine returns the result in the editor window as a comment after the analyzed code, as shown in the example below.
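For example, running Explain Code on a small Delphi routine could produce output along these lines; the comment text is invented for illustration, and the actual wording depends on the engine you use.

```delphi
// Selected code sent to the engine:
function Add(A, B: Integer): Integer;
begin
  Result := A + B;
end;

{ Illustrative result appended by the engine as a comment:
  This function, Add, takes two Integer parameters, A and B,
  and returns their sum. }
```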
Available Commands
Check out the available commands for Smart CodeInsight:
- AI Chat: Open the chat view.
- Find Bugs: Try to find potential bugs in the selected code.
- Explain Code: Explain the selected code.
- Add Comment: Add comments to the selected code.
- Complete the code: Complete the selected code.
- Optimize code: Optimize the selected code.
- Add unit test: Add a unit test for the selected code.
- Convert to Assembly: Convert the selected code to Assembly code.
- Convert to Delphi: Convert the selected code to Delphi code (from C++ or Assembly).
- Convert to C++ Builder: Convert the selected code to C++Builder code.
AI Chat Window
The AI chat window is a dockable IDE form that works like any LLM chat window. You can type a request, pick an engine (unless you want to use the default one), and wait for the answer.
This chat window has a question memo that can act like a simple command line. These are the available commands:
- chatgpt> + Enter: switches the active AI engine to ChatGPT (if enabled).
- gemini> + Enter: switches the active AI engine to Gemini (if enabled).
- claude> + Enter: switches the active AI engine to Claude (if enabled).
- ollama> + Enter: switches the active AI engine to Ollama (if enabled).
- clear> + Enter: clears the answer memo.
- stop> + Enter: stops generating the answer, the same as clicking the stop button.
- Ctrl + Enter: starts generating the answer, the same as clicking the start button.