Support AI-powered DevOps with on-premises LLM
Generative language models can not only generate, classify, and summarize text; they are also poised to transform software development. Barely a month goes by without a new language model being introduced, trained on billions of source-code tokens from different programming languages and dialects. While early code assistants were good at completing existing code or generating code from entered comments, current LLMs can assist across the entire software lifecycle.

The advantage of this is that in the DevOps process, developers no longer have to leave the development environment to create different code assets. It is not just proprietary products such as GitHub Copilot or Codeium that promise support via plug-ins for common integrated development environments.

  • Anyone who uses code assistants like GitHub Copilot or Codeium as software as a service has to push their source code to the cloud. This creates legal challenges for some companies.
  • In addition to large language models from the cloud, the Continue.Dev plug-in for Visual Studio Code and JetBrains products can also integrate local or self-hosted LLMs.
  • In addition to code completion, Continue.Dev integrates separately configurable chat models and embeddings so you can query your own code.
  • The plug-in supports developers in all stages of the DevOps process, such as deploying services to cloud providers.
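To illustrate the local setup the list above describes, the following is a minimal sketch of a Continue.Dev `config.json` (typically located at `~/.continue/config.json`) that wires the chat model, tab autocompletion, and embeddings to models served locally via Ollama. The specific model names are examples, not recommendations from the article:

```json
{
  "models": [
    {
      "title": "Llama 3 (local)",
      "provider": "ollama",
      "model": "llama3"
    }
  ],
  "tabAutocompleteModel": {
    "title": "StarCoder2 (local)",
    "provider": "ollama",
    "model": "starcoder2:3b"
  },
  "embeddingsProvider": {
    "provider": "ollama",
    "model": "nomic-embed-text"
  }
}
```

With a configuration along these lines, completions, chat, and codebase queries never leave the developer's machine, which addresses the legal concerns around cloud-based assistants mentioned above.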
Ramon Wartala is Director of Data Science at Accenture Song in Hamburg. As a consultant, he and his team design and implement data architectures for their clients’ machine learning solutions.

There are now many IDE plug-ins with very different requirements for use. Products like CodeGPT, for example, require access to OpenAI’s GPT-4 or the AWS Bedrock service and are only free in the basic version. Other plug-ins, such as Code Assist for JetBrains products, communicate with servers in the provider’s cloud.

This was an excerpt from our Heise Plus article “Supporting AI-enabled DevOps with Local LLMs”. With a Heise Plus subscription you can read and listen to the full article.

