Microsoft released a new Visual Studio Code extension called Prompty, designed to integrate Large Language Models (LLMs) like GPT-4o directly into .NET development workflows. This free tool aims to simplify the process of adding AI-driven capabilities to applications. The official release post includes a practical example demonstrating how Prompty can be used in real-world scenarios.
Prompty is available for free on the Visual Studio Code Marketplace and offers .NET developers an intuitive interface for working with LLMs. Whether building chatbots, generating content, or adding other AI-driven functionality, Prompty provides an easy way to integrate these capabilities into an existing development environment.
While Prompty has been well-received for its innovative approach to integrating AI into .NET development, some community members have expressed concerns about its availability. On LinkedIn, Jordi Gonzalez Segura expressed disappointment that Prompty is not accessible to those using Visual Studio Professional.
Using Prompty in Visual Studio Code involves several steps:
- Installation: Developers start by installing the Prompty extension from the Visual Studio Code Marketplace.
- Setup: After installation, users configure the extension by providing API keys and setting up necessary parameters to connect to the LLM, such as GPT-4o.
- Integration: Prompty integrates into the development workflow by allowing users to create new files or modify existing ones with embedded prompts. Commands and snippets are provided to easily insert prompts and handle responses from the LLM.
- Development: Developers write prompts directly in the codebase to interact with the LLM. Prompty supports multiple prompt formats and offers syntax highlighting, making prompts easier to read and maintain. These prompts can generate code snippets, produce documentation, or help troubleshoot by querying the LLM about specific issues.
- Testing: Prompty enables rapid iteration and testing, allowing developers to refine prompts to improve response accuracy and relevance.
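To make the steps above concrete, a .prompty file pairs YAML-style frontmatter (name, description, model configuration, sample inputs) with a templated prompt body that uses `{{placeholder}}` syntax. The sketch below is illustrative only; the exact field names and the deployment name are assumptions, not an authoritative schema:

```yaml
---
name: weather_summary
description: Turn raw forecast data into a friendly summary
model:
  api: chat
  configuration:
    type: azure_openai
    azure_deployment: gpt-4o   # hypothetical deployment name
inputs:
  city: Toronto
  condition: sunny
---
system:
You are a meteorologist who writes short, engaging forecasts.

user:
Write a one-paragraph forecast for {{city}}, given that it is {{condition}}.
```

When the file is run from the extension, the frontmatter tells Prompty which model to call, and the body below the second `---` becomes the rendered prompt.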
Bruno Capuano, a principal cloud advocate at Microsoft, published a real-world example demonstrating how Prompty can enhance a .NET WebAPI project. In it, Prompty generates detailed weather forecast descriptions, transforming the API's standard output into richer, more engaging content. By defining prompts in a .prompty file and configuring the connection to the LLM, developers can dynamically produce detailed weather summaries. The sample uses Semantic Kernel to automate the generation of descriptive text, improving the quality and informativeness of the application's output.
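Capuano's sample itself is C# built on Semantic Kernel; as a language-agnostic sketch of the core templating step it relies on, the minimal Python below strips the frontmatter from a .prompty-style file and fills in the `{{placeholders}}` before the prompt would be sent to a model. The template text and function name are illustrative assumptions, not part of the actual sample:

```python
import re

# Illustrative .prompty-style content: frontmatter, then a templated body.
PROMPTY = """---
name: weather_summary
description: Turn raw forecast data into a friendly summary
---
system:
You are a meteorologist who writes short, engaging forecasts.

user:
Write a short forecast for {{city}}: {{temperature_c}} C and {{condition}}.
"""

def render_prompty(text: str, **values) -> str:
    """Drop the YAML frontmatter and substitute {{name}} placeholders."""
    # The body is everything after the second '---' delimiter.
    _, _, body = text.split("---", 2)
    return re.sub(r"\{\{(\w+)\}\}",
                  lambda m: str(values[m.group(1)]),
                  body).strip()

prompt = render_prompty(PROMPTY, city="Toronto", temperature_c=21, condition="sunny")
print(prompt)
```

In the real workflow this rendering is handled for you: Semantic Kernel loads the .prompty file, binds the input values, and forwards the rendered prompt to the configured LLM.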
Additional information about Prompty, its features, and integration into development workflows can be found on the Prompty Visual Studio Code extension page or in the Prompty source code available on GitHub.