OpenStack with Sunbeam as an on-prem extension of the OpenStack public cloud

One of the biggest challenges that cloud service providers (CSPs) face these days is delivering an extension of the public cloud they host to a small-scale piece of infrastructure that runs on customers’ premises. While the world’s tech giants, such as Amazon and Microsoft, have developed their own solutions for this purpose, many smaller, regional CSPs rely on open source projects like OpenStack instead.

AI on Public Cloud with Open Source

AI is at the heart of a revolution in the technology space. Organisations from all industries are looking for ways to put AI to work. Once they have finalised their use case assessment, their next question typically concerns the environment they will use to develop and deploy their AI initiatives. They often prefer public clouds as an initial environment because of the computing power and the ability to scale as projects mature. In addition to the infrastructure, enterprises need software with which they can develop and deploy machine learning models.

AI and automotive: navigating the roads of tomorrow

I had the pleasure of being invited by Canonical’s AI/ML Product Manager, Andreea Munteanu, to one of the recent episodes of the Canonical AI/ML podcast. As an automotive and technology enthusiast with a background in software, I was eager to share my insights into the influence of artificial intelligence (AI) on the automotive industry.

Generative AI with Ubuntu on AWS. Part II: Text generation

In our previous post, we discussed how to generate images using Stable Diffusion on AWS. In this post, we will guide you through running LLMs for text generation in your own environment on a GPU-based instance, in a few simple steps, empowering you to create your own solutions. Text generation, a trending focus in generative AI, facilitates a broad spectrum of language tasks beyond simple question answering.
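As a taste of what the full post walks through, here is a minimal sketch of GPU-backed text generation using the Hugging Face transformers pipeline; the model name, prompt, and generation parameters are illustrative assumptions rather than the exact steps from the article.

```python
# Minimal sketch: text generation on a GPU-based instance with the
# Hugging Face transformers pipeline. Model and prompt are placeholders,
# not the configuration used in the post.
import torch
from transformers import pipeline

# Use the GPU if one is available (e.g. on a GPU-based cloud instance),
# otherwise fall back to CPU.
device = 0 if torch.cuda.is_available() else -1

# Any causal language model from the Hugging Face Hub can be dropped in
# here; "gpt2" is chosen only because it is small enough to run anywhere.
generator = pipeline("text-generation", model="gpt2", device=device)

prompt = "Open source tooling makes it easier to"
outputs = generator(prompt, max_new_tokens=50, num_return_sequences=1)

print(outputs[0]["generated_text"])
```

The same pattern scales to larger models on bigger GPU instances by swapping the model name and adjusting the generation parameters.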

What is a telco cloud?

Telecommunications companies (telcos) are well on their way to transforming their infrastructure from the legacy, unadaptable, complex networks of dedicated hardware of yesteryear to agile, modular and scalable software-defined systems running on commercial off-the-shelf (COTS) servers. Within this space, the current trend, driven by 5G deployments, is to complement tried and tested network function virtualisation (NFV) infrastructure with cloud-native network functions (CNFs).

Ubuntu AI | S2E4 | AI on public cloud: what should you know?

A Weka report from 2024 showed that 47% of respondents will use the public cloud as the primary place to develop their machine learning projects. This is the result of a combination of factors, including the need for compute power, easy scalability, and the ability to utilise infrastructure already in place across both hybrid and public clouds. Join us to talk more about AI on the public cloud: what are the main benefits, and what best practices can an organisation implement to adopt AI more easily and get the most out of public clouds?