Embarcadero offers Smart CodeInsight in the IDE and a separate AI component pack, SmartCore AI, available for download via the GetIt Package Manager.
Smart CodeInsight offers an AI chat pane integrated into the RAD Studio, Delphi, or C++Builder IDE. It can answer questions you type, and it provides direct commands that you can run on selected text in the RAD Studio editor via the editor's local context menu.
The commands perform standard actions like adding comments, checking for security flaws, creating unit tests, and more.
The Tools > Options dialog box includes a Smart CodeInsight configuration page. On this page you can enable the feature as a whole, configure and enable the backend LLMs you want to use, and separately select which LLM powers the chat and which powers the editor commands.
No. After installing RAD Studio, Delphi, or C++Builder 13 Florence, no AI components are present or active until you enable the Smart CodeInsight feature in the Tools > Options dialog box or install the SmartCore AI component pack via GetIt. To use the AI capabilities, you need to separately create an account with the appropriate AI provider, or configure a local offline LLM to enable AI features and process data locally.
Yes. You can disable the AI feature at any time in the Tools > Options dialog box. You can also change the backend LLM used by the RAD Studio IDE at any time.
This depends entirely on the backend LLM you select. With Ollama and other local engines, no data leaves your computer. If you select OpenAI, Gemini, or Claude, the data you specifically select is sent to that provider's servers, subject to the provider's license agreement, terms of use, and the privacy and IP protections it offers.
The data is not sent to an Embarcadero backend and we do not receive or process your data in any way, including to train any model.
Aside from Ollama and other local LLMs, which require a one-time model download handled separately by the developer and can then run entirely on your machine once configured, all LLMs provided through the supported AI providers listed above require an active internet connection.
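To illustrate the local workflow, here is a minimal Python sketch of how any application could talk to a locally running Ollama server over its documented REST API. The model name `llama3.2` is only an example (any previously pulled model works), and the request never leaves localhost:

```python
import json
import urllib.request

# Ollama's default local endpoint; no API key or account is needed.
OLLAMA_URL = "http://localhost:11434/api/generate"


def build_request(model: str, prompt: str) -> dict:
    """Build the JSON payload Ollama's /api/generate endpoint expects."""
    return {"model": model, "prompt": prompt, "stream": False}


def ask_local_llm(model: str, prompt: str) -> str:
    """Send a prompt to the local Ollama server and return its response text."""
    payload = json.dumps(build_request(model, prompt)).encode("utf-8")
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["response"]
```

Calling `ask_local_llm("llama3.2", "...")` assumes you have already run `ollama pull llama3.2` once to download the model; after that, everything runs offline.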
We support four different APIs from four different LLM vendors, and each vendor offers different models. In the RAD Studio IDE, after you configure the endpoint and provide your access key (if one is needed), a combo box lists the available models for you to choose from. Different models may have different associated costs.
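Each backend pairs an endpoint with an optional access key. As a hedged sketch of that configuration shape, the Python snippet below uses each vendor's publicly documented base URL and its conventional API-key environment variable; the exact values the IDE stores internally may differ:

```python
import os
from typing import Optional

# Publicly documented base URLs and conventional API-key environment
# variables for each kind of supported backend. Ollama runs locally
# and needs no account or key.
BACKENDS = {
    "openai": {"endpoint": "https://api.openai.com/v1",
               "key_env": "OPENAI_API_KEY"},
    "gemini": {"endpoint": "https://generativelanguage.googleapis.com",
               "key_env": "GEMINI_API_KEY"},
    "claude": {"endpoint": "https://api.anthropic.com",
               "key_env": "ANTHROPIC_API_KEY"},
    "ollama": {"endpoint": "http://localhost:11434",
               "key_env": None},  # local engine: no key required
}


def get_access_key(backend: str) -> Optional[str]:
    """Return the configured access key for a backend, or None if it needs none."""
    key_env = BACKENDS[backend]["key_env"]
    return os.environ.get(key_env) if key_env else None
```

The design point the sketch captures is that only the hosted backends carry a key at all, which is why a local Ollama setup works without creating any provider account.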
All of the configuration settings are in the Tools > Options dialog box or, in the case of the SmartCore AI component pack, in the components' settings.
The SmartCore AI Component Pack is an additional set of components you can use to add AI features to your own applications built with RAD Studio, Delphi, or C++Builder. These components are not installed by default with RAD Studio; they are an additional download available in the GetIt Package Manager, accessible from the RAD Studio IDE.
Download the component pack from GetIt and add the AI Connection component, along with the other components, to your applications written with RAD Studio, Delphi, or C++Builder. In the AI Connection component you can configure which LLM to use, local or hosted, and provide your API key in the case of paid, online services.