Artificial Intelligence Content on InfoQ
-
Microsoft Shares Lessons Learned on Building AI Copilots
Researchers at Microsoft and GitHub Inc. have conducted an in-depth study into the challenges, opportunities, and needs associated with building AI-powered product copilots. The research involved interviews with 26 professional software engineers from various companies who are responsible for developing these advanced tools.
-
Upcoming InfoQ & QCon Events: Level up on Generative AI, Security, Platform Engineering, and More
The upcoming InfoQ events are a platform to help you stay ahead, gain valuable insights, and find practical solutions to your development challenges in 2024 and beyond. The events are curated for senior software engineers, architects, and team leaders, offering practitioner insights into emerging trends, patterns, and practices.
-
NVIDIA Introduces Metropolis Microservices for Jetson to Run AI Apps at the Edge
NVIDIA has expanded its Metropolis Microservices cloud-based AI solution to run on the NVIDIA Jetson embedded IoT platform, including support for video streaming and AI-based perception.
-
OpenAI Releases New Embedding Models and Improved GPT-4 Turbo
OpenAI recently announced several updates to its models, including two new embedding models and updated versions of GPT-4 Turbo and GPT-3.5 Turbo. The company also announced improvements to its free text moderation tool and its developer API management tools.
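As a rough illustration of how the new embedding models are called, a minimal sketch using the OpenAI Python client follows; the model name and the optional dimensions parameter reflect the announced API, but treat the snippet as a sketch rather than official sample code.

```python
# Minimal sketch: requesting an embedding from one of the newly announced models.
# Assumes the openai Python package (v1.x) and an OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

response = client.embeddings.create(
    model="text-embedding-3-small",  # one of the new embedding models
    input="InfoQ covers news for software architects and senior engineers.",
    dimensions=256,  # the new models can return shortened embeddings
)

vector = response.data[0].embedding
print(len(vector))  # 256 floats
```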
-
Java News Roundup: LibericaJDK with RISC-V, Payara Platform, Gradle 8.6, LangChain4j, Spring Cloud
This week's Java roundup for January 29th, 2024, features news on LibericaJDK 21 with support for RISC-V, the January release of the Payara Platform, Gradle 8.6, LangChain4j 0.26, GraalVM Native Build Tools 0.10, and multiple releases of Open Liberty and Eclipse Vert.x.
-
Meta Releases Code Generation Model Code Llama 70B, Nearing GPT-3.5 Performance
Code Llama 70B is Meta's new code generation AI model. With 70 billion parameters, it is "the largest and best-performing model in the Code Llama family", according to Meta.
-
Stability AI Releases 1.6 Billion Parameter Language Model Stable LM 2
Stability AI released two sets of pre-trained model weights for Stable LM 2, a 1.6B parameter language model. Stable LM 2 is trained on 2 trillion tokens of text data from seven languages and can be run on common laptop computers.
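For readers who want to try the "runs on a laptop" claim, a minimal sketch using the Hugging Face transformers library is shown below; the repository id stabilityai/stablelm-2-1_6b is an assumption based on Stability AI's usual naming, so verify it on the model card before running.

```python
# Minimal sketch: loading Stable LM 2 1.6B with Hugging Face transformers on a CPU.
# The repository id is an assumption; check the model card on the Hugging Face Hub.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "stabilityai/stablelm-2-1_6b"  # assumed repository name
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

inputs = tokenizer("The three largest cities in Germany are", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```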
-
Hugging Face and Google Cloud Announce Collaboration
Hugging Face and Google Cloud have announced a strategic alliance to advance machine learning and open AI research. The partnership focuses on three areas: Google Cloud customers, Hugging Face Hub users, and open-source development. Google wants to make cutting-edge AI discoveries available through Hugging Face's open-source frameworks.
-
Harnessing AI-Generated CloudFormation with Application Composer
The AWS Toolkit for VS Code has recently extended its support to include AWS Application Composer, introduced a year ago in the AWS Management Console. This enhancement lets users craft Infrastructure as Code (IaC) covering more than 1,100 AWS CloudFormation resource types.
-
Visual Studio GitHub Copilot Extension Introduces New Features and Enhancements
The latest release of the Visual Studio GitHub Copilot Chat extension introduces two noteworthy productivity features: slash commands and context variables. Developers can also explore preview features such as the Exception Assistant, Test Failure Analysis, Suggestions for Breakpoint Expressions, and Commit Message Suggestions.
-
Microsoft Copilot: Copilot Pro, Copilot for Microsoft 365, Copilot GPT and More
Microsoft has released Copilot Pro and Copilot for Microsoft 365, and is providing free access to those tools for smaller organizations and educational faculty. The company has also created a Copilot mobile application, and Copilot is now available in the Microsoft 365 mobile application as well.
-
LeftoverLocals May Leak LLM Responses on Apple, Qualcomm, and AMD GPUs
Security firm Trail of Bits disclosed a vulnerability allowing malicious actors to recover data from GPU local memory on Apple, Qualcomm, AMD, and Imagination GPUs. Dubbed LeftoverLocals, the vulnerability affects any application using the GPU, including Large Language Models (LLMs) and machine learning (ML) models.
-
Mistral AI's Open-Source Mixtral 8x7B Outperforms GPT-3.5
Mistral AI recently released Mixtral 8x7B, a sparse mixture of experts (SMoE) large language model (LLM). The model contains 46.7B total parameters, but performs inference at the same speed and cost as models one-third that size. On several LLM benchmarks, it outperformed both Llama 2 70B and GPT-3.5, the model powering ChatGPT.
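To make the "one-third the size" claim concrete, the back-of-the-envelope sketch below walks through the sparse mixture-of-experts arithmetic; the 8-experts/2-active routing matches Mistral's description of Mixtral, while the split between shared and expert parameters is an approximation for illustration only.

```python
# Back-of-the-envelope sketch of the sparse mixture-of-experts (SMoE) arithmetic.
# The shared/expert parameter split is approximate and used only for illustration.
total_params = 46.7e9    # all parameters, which must still fit in memory
num_experts = 8          # experts per feed-forward layer
active_experts = 2       # experts the router selects for each token
shared_params = 1.6e9    # rough non-expert share (attention, embeddings, norms)

expert_params_each = (total_params - shared_params) / num_experts
active_params = shared_params + active_experts * expert_params_each

print(f"parameters touched per token: {active_params / 1e9:.1f}B")
# ~12.9B, i.e. roughly the inference cost of a dense model one-third the total size.
```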
-
Using ChatGPT for Amplifying Software Testing Practices and Assisting Software Delivery
Artificial intelligence can assist software delivery, automate software testing, and optimize project work. Dimitar Panayotov uses ChatGPT to generate test data, create email templates, and produce explanations based on test results. This saves him time that he can invest elsewhere, making him more productive.
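As an illustration of the test-data use case, a minimal sketch of asking a chat model for structured test data through the OpenAI API follows; the model name, prompt, and JSON handling are assumptions for the example, not a description of Panayotov's actual setup.

```python
# Minimal sketch: asking a chat model to generate structured test data.
# Model name and prompt are illustrative assumptions, not a prescribed setup.
import json
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-3.5-turbo",  # any chat-capable model works here
    messages=[
        {"role": "system", "content": "Return realistic test data as raw JSON only."},
        {"role": "user", "content": "Create a JSON array of 3 user records with name, email, and signup_date."},
    ],
)

# In practice the reply may need cleanup (e.g. stripping Markdown fences) before parsing.
test_users = json.loads(response.choices[0].message.content)
for user in test_users:
    print(user["name"], user["email"], user["signup_date"])
```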
-
GitHub Copilot Chat Now Generally Available
GitHub Copilot Chat, a natural-language-powered coding tool, is now generally available, GitHub recently announced. Part of GitHub Copilot, the tool is designed to elevate natural language into a universal programming language. Powered by GPT-4, Copilot Chat is a context-aware AI assistant tailored for development scenarios.