

From MLOps to LLMOps: The evolution of automation for AI-powered applications

Machine learning operations (MLOps) has become the backbone of efficient artificial intelligence (AI) development. Blending ML with development and operations best practices, MLOps streamlines deploying ML models through continuous testing, updating, and monitoring. But as ML and AI use cases continue to expand, the need arises for specialized tools and best practices to handle the unique demands of complex AI apps, such as those using large language models (LLMs).

Democratizing AI Research: How Llama's Accessibility Changes the Game

Artificial Intelligence (AI) is on the brink of unlocking a new era in nearly every industry, from healthcare to manufacturing. Until recently, however, AI research remained the preserve of well-funded labs and academics with extensive computational resources and domain expertise. The emergence of accessible AI research tools like Llama 2 is changing this paradigm, empowering individuals and small teams to contribute to the AI revolution. Here's how Llama's accessibility transforms AI research, and why every aspiring AI enthusiast should take note.

Maximize IT efficiency leveraging alert management with Elastic AI Assistant for Observability

Manage and correlate signals and alerts in Elastic Observability

As organizations embrace increasingly complex and interconnected IT systems, the sheer volume of alerts generated by diverse monitoring tools has given rise to a critical challenge: how do we efficiently sift through the noise to identify and respond to the most crucial issues? Event management and correlation are two indispensable pillars of IT service management.

AI realism (part one)

Emotions are running high about AI technologies. In this two-part series, I do my best to make a rational case about the reality of AI and how we can respond to it. This is part one; part two follows next week. We seem to be struggling to have pragmatic discussions about advancements in Artificial Intelligence. It’s hard to hear calmer voices over the detractors and the breathless enthusiasts.

AI-powered Autofix debugs & fixes your code in minutes

Sentry knows a lot about the inner workings of an application’s codebase. So we got to thinking: how can we use this rich dataset to make debugging with Sentry even faster? Many generative AI (GenAI) tools (e.g., GitHub Copilot) improve developer productivity in the dev environment, but few have the contextual data Sentry has to help fix errors in production.

Secure your AI workloads with confidential VMs

AI models run on large amounts of high-quality data, and for sensitive tasks like medical diagnosis or financial risk assessment, you need access to private data during both training and inference. When performing machine learning tasks in the cloud, enterprises are understandably concerned about data privacy as well as their model’s intellectual property. Additionally, stringent industry regulations often prohibit sharing such data.