MLOps

11 sessions
March 2024
, Director, Product Architecture, NVIDIA
, Principal Product Architect, NVIDIA
Virtually every organization has heard that they need to establish an MLOps practice or at least define an MLOps strategy. Because machine learning systems are complex software systems built by cross-functional teams, the ecosystem of tools supporting MLOps is rich and diverse. Unfortunately,
October 2024
Lockheed Martin is taking advantage of our system engineering heritage and machine learning operations (MLOps) to meet the standards set by the new Executive Order on AI safety and security. Our framework—implemented in our AI factory MLOps and generative AI ecosystem—will ensure
March 2024
, Distinguished Cloud Architect, Oracle
We'll present a comprehensive overview of the integration and optimization of large language model (LLM) training using NVIDIA DGX Cloud on Oracle Cloud Infrastructure (OCI), coupled with efficient inference deployment on OCI's GPU infrastructure. We'll specifically focus on how the NVIDIA Triton
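As a rough illustration of the kind of inference deployment this session touches on, the sketch below sends a single request to a Triton Inference Server over HTTP using the tritonclient Python package. The endpoint, model name, and tensor names ("text_input", "text_output") are placeholder assumptions for the example, not details from the session or from the OCI deployment it describes.

```python
# Illustrative only: a minimal Triton Inference Server client call, assuming a
# server is already running at TRITON_URL with a model named "llm_ensemble"
# (both are placeholders, not taken from the session).
import numpy as np
import tritonclient.http as httpclient

TRITON_URL = "localhost:8000"   # placeholder endpoint
MODEL_NAME = "llm_ensemble"     # placeholder model name

client = httpclient.InferenceServerClient(url=TRITON_URL)

# Prepare a single text prompt as a BYTES tensor; the actual shape and dtype
# depend on how the deployed model's config.pbtxt defines its inputs.
prompt = np.array([["Summarize MLOps in one sentence."]], dtype=object)
infer_input = httpclient.InferInput("text_input", prompt.shape, "BYTES")
infer_input.set_data_from_numpy(prompt)

requested_output = httpclient.InferRequestedOutput("text_output")

response = client.infer(MODEL_NAME, inputs=[infer_input], outputs=[requested_output])
print(response.as_numpy("text_output"))
```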
March 2024
, MLOps Engineer, Weights and Biases
This session is focused on fine-tuning large language model (LLM) agents, acquiring crucial insights and techniques for enhancing the performance and specificity of local LLM agents in application automation. We'll explore a variety of essential topics, including: • Fine-tuning techniques: Learn
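For readers who want a concrete starting point for the fine-tuning techniques the session covers, here is a minimal LoRA fine-tuning sketch using the Hugging Face transformers and peft libraries. The base model, dataset, and hyperparameters are illustrative assumptions, not choices taken from the session.

```python
# Illustrative only: LoRA fine-tuning of a small causal LM with transformers + peft.
from datasets import load_dataset
from peft import LoraConfig, get_peft_model
from transformers import (AutoModelForCausalLM, AutoTokenizer,
                          DataCollatorForLanguageModeling, Trainer,
                          TrainingArguments)

BASE_MODEL = "TinyLlama/TinyLlama-1.1B-Chat-v1.0"  # placeholder base model

tokenizer = AutoTokenizer.from_pretrained(BASE_MODEL)
tokenizer.pad_token = tokenizer.eos_token
model = AutoModelForCausalLM.from_pretrained(BASE_MODEL)

# Attach low-rank adapters to the attention projections so only a small
# fraction of the weights is trained.
lora_config = LoraConfig(r=8, lora_alpha=16, lora_dropout=0.05,
                         target_modules=["q_proj", "v_proj"],
                         task_type="CAUSAL_LM")
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()

# Tokenize a small slice of an instruction dataset (placeholder dataset name).
dataset = load_dataset("tatsu-lab/alpaca", split="train[:1%]")
def tokenize(batch):
    return tokenizer(batch["text"], truncation=True, max_length=512)
tokenized = dataset.map(tokenize, batched=True, remove_columns=dataset.column_names)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="lora-agent", per_device_train_batch_size=1,
                           num_train_epochs=1, learning_rate=2e-4),
    train_dataset=tokenized,
    data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
)
trainer.train()
model.save_pretrained("lora-agent")  # saves only the adapter weights
```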
October 2024
, Vice President and Global Head of Cloud and Edge Business, Tata Communications
, Head of Technology and Platform Development, Tata Communications
Explore how organizations can achieve unparalleled AI performance by seamlessly integrating cloud and edge technologies. This session unveils how we are leveraging scalable NVIDIA GPUs, seamless connectivity through our robust network, and MLOps capabilities across training, fine-tuning, and
March 2024
, Director, Generative AI and LLMOps Platform, NVIDIA
, Director, Product Architecture, NVIDIA
Dive into the world of LLMOps, the next-gen solution for managing Large Language Models (LLMs) in production. Building on MLOps' core capabilities, LLMOps introduces specialized tools and services tailored for the unique challenges of LLMs. Relive an LLMOps journey through the real case study of
March 2024
, M.Sc. ETH Zürich, Product Owner and Data Engineering, LOOP Zurich, University Hospital Balgrist
, Product Manager, Azure ML, Microsoft
With the inflection point of AI, medical operations are facing the paradigm shift of digitalization. Medical imaging is a key area for developing high value-added deep learning (DL) models that streamline and accelerate the process of image segmentation and classification, which is a major time-consuming
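As a small, generic illustration of the segmentation models the abstract refers to, the sketch below defines a 3D U-Net with MONAI and runs a forward pass on a dummy volume. The architecture sizes and input shape are placeholders, not details of the Balgrist or Azure ML setup described in the session.

```python
# Illustrative only: a 3D U-Net forward pass with MONAI on a fake volume.
import torch
from monai.networks.nets import UNet

model = UNet(
    spatial_dims=3,          # volumetric (e.g. CT/MRI) input
    in_channels=1,           # single-modality scan
    out_channels=2,          # background vs. structure of interest
    channels=(16, 32, 64, 128),
    strides=(2, 2, 2),
)

# Random tensor standing in for a preprocessed scan: (batch, channel, D, H, W).
volume = torch.randn(1, 1, 64, 64, 64)
with torch.no_grad():
    logits = model(volume)
print(logits.shape)  # torch.Size([1, 2, 64, 64, 64])
```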
March 2024
, Senior Manager, DGX Product Architecture, NVIDIA
, Product Manager, DataRobot
In the ever-evolving landscape of artificial intelligence, the demand for high-performance, value-driven solutions has never been greater. Businesses across industries are seeking ways to leverage AI to not only enhance their operations, but also achieve tangible value and return on investment. To address
March 2023
, Senior SW Developer for Autonomous Driving, BMW AG
Machine learning enables self-driving cars to handle the infinite combinations of situations that can’t be covered by a set of rules. However, machine learning development entails much more than crafting a model architecture and learning its hyperparameters from examples. Data, the source of the
March 2023
, CPO, Arize AI
, Global DevRel, MLOps Integrations and Partners, NVIDIA
, Principal ML Specialist Solutions Architect, Amazon Web Services (AWS)
, Senior Director of Product Marketing, ClearML
, Co-Founder and CTO, Iguazio
This panel will discuss and recommend approaches for building an MLOps solution for AI workflows and enterprise applications. Continuous delivery and automated deployment of AI workloads are a core focus for organizations moving AI into production.
March 2023
, Co-Founder and CEO, DagsHub
We love talking about deploying our machine learning models. One famous (but probably wrong) statement says that “87% of data science projects never make it to production.” But how can we get to the promised land of "Production" if we're not even sure what "Production" means? If we could