Open Source MLOps on AWS

With the rise of generative AI, enterprises are growing their AI budgets, looking for options to quickly set up the infrastructure and run the entire machine learning cycle. Cloud providers like AWS are often preferred to kick-start AI/ML projects as they offer the computing power to experiment without long-term commitments. Starting in the cloud removes the burden of provisioning computing power, reducing start-up time and cost and allowing teams to iterate more quickly.

The generative AI societal shift

Once upon a time, not so long ago, the world was a different place. The idea of a "smartphone" was still a novelty, and the mobile phone was primarily a tool for making calls and, perhaps, sending the occasional text message. Yes, we had "smart" phones, but they were simpler, mostly geared toward business users and mostly used for, well, phone stuff. Web browsing? It was there, but light, not something you'd do for hours.

Top Trends in DevOps - ChatGPT

The world of DevOps is constantly evolving and adapting to the needs of the software development industry. With the increasing demand for faster and more efficient software delivery, organizations are turning to modern technologies and practices to help them meet these challenges. In a series of articles on the Kublr blog, we will take a look at some of today’s top DevOps trends.

How to observe your TensorFlow Serving instances with Grafana Cloud

The world of AI and machine learning has evolved at an accelerated pace these past few years, and the advent of ChatGPT, DALL-E, and Stable Diffusion has brought a lot of additional attention to the topic. Being aware of this, Grafana Labs prepared an integration for monitoring one of the most widely used machine learning model servers available: TensorFlow Serving. TensorFlow Serving is an open source, flexible serving system built to support the use of machine learning models at scale.
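As background for that integration: TensorFlow Serving can expose metrics in Prometheus format via its monitoring config file, which a Grafana agent can then scrape. A minimal sketch (the file name and model paths here are illustrative assumptions, not from the article):

```
# monitoring.config (protobuf text format, passed to tensorflow_model_server
# via the --monitoring_config_file flag)
prometheus_config {
  enable: true
  path: "/monitoring/prometheus/metrics"
}
```

With this in place, starting the server as `tensorflow_model_server --rest_api_port=8501 --monitoring_config_file=monitoring.config ...` makes metrics such as request counts and latencies available at the configured path for scraping.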

How Generative AI Can Benefit Your Knowledge Management

There has been growing interest in the capabilities of generative AI since the release of tools like ChatGPT, Google Bard, Amazon Large Language Models and Microsoft Bing. With the hype come concerns about privacy, PII, security and, even more importantly, accuracy. And rightly so. Organizations are treading cautiously in their acceptance of generative AI tools, despite seeing them as a game changer.

Impact of AI on IT Operations

The rise of artificial intelligence in every domain is apparent, and as a result, its impact on IT operations needs to be understood. AI is a field of computer science focused on developing intelligent machines that can perform tasks typically requiring human intelligence and decision-making. But what exactly are IT operations?

Five worthy reads: Generative AI can revolutionize healthcare, but is there a catch?

Five worthy reads is a regular column on five noteworthy items we’ve discovered while researching trending and timeless topics. This week we are exploring how generative AI can change the future of healthcare. Step into the world of generative AI, the fascinating field of AI that’s got everyone in the healthcare industry buzzing.

Unleashing the Potential: Exploring the Impact of Artificial Intelligence (AI) in Education

Discover the transformative role of artificial intelligence (AI) in education, revolutionizing the learning experience for both students and educators. Explore how AI enhances access to information, provides personalized feedback, and empowers learners through smart learning apps.

Improving LLMs in Production With Observability

Quickly: if you’re interested in observability for LLMs, we’d love to talk to you! And now for our regularly scheduled content: In early May, we released the first version of our new natural language querying interface, Query Assistant. We also talked a lot about the hard stuff we encountered when building and releasing this feature to all Honeycomb customers. But what we didn’t talk about was how we know how our use of an LLM is performing in production!