ELT pipelines create duplicate datasets and costs. A “shift left” offers a better approach. We'll explore best practices to make data accessible across operational, analytical, and hybrid systems using data primitives such as streams, tables, schemas, and Apache Iceberg.
Discover how organizations are cleaning up their data mess by making the shift to a data products mindset. Learn what data products are, how they deliver business value, and the best way to build them with data streaming platforms that drive real-time experiences faster and more cost-effectively.
Discover how to unlock the full potential of your SAP data with Confluent Cloud and real-time data streaming. Learn key use cases, industry examples, and more.
See how BMW, Michelin, and Siemens use Apache Kafka for event-driven systems, and how your manufacturing organization can run event-driven microservices with data streaming.
Read the 2024 Data Streaming Report and discover how data streaming is being used by companies across industries to increase ROI, drive breakthrough innovation in artificial intelligence, and improve the customer experience.
Learn how to take full advantage of Apache Kafka®, the distributed, publish-subscribe messaging system for handling real-time data feeds.
Jay Kreps, CEO of Confluent and original co-creator of Apache Kafka, shows how logs work in distributed systems, and provides practical applications of these concepts.
Shoe retailer NewLimits is struggling with decentralized data processing challenges and needs a manageable, cost-effective stream processing solution for an important upcoming launch. Join developer Ada and architect Jax as they learn why Apache Kafka and Apache Flink are better together.
Download this Forrester study to understand the economic benefits of Confluent Cloud.
In this white paper, we provide a holistic overview of an active-passive multi-region DR solution based on the capabilities of Confluent Cloud, the only fully managed, cloud-native service for Apache Kafka.
Learn how to successfully implement a data mesh and build data products using Confluent’s data streaming platform, leveraging connectors, stream processing, and Stream Governance.
Explore the 2024 Data Streaming Report to discover the trends and tactics IT leaders are leveraging to boost ROI, AI adoption, and innovation with data streaming.
Learn how Apache Kafka, Confluent, and event-driven microservices ensure real-time communication and event streaming for modernized deployment, testing, and continuous delivery.
A practical guide to configuring multiple Apache Kafka clusters so that if a disaster scenario strikes, you have a plan for failover, failback, and ultimately successful recovery.
Businesses are discovering that they can create new business opportunities as well as make their existing operations more efficient using real-time data at scale. Learn how real-time data streams are revolutionizing your business.
This whitepaper discusses how to optimize your Apache Kafka deployment for various service goals, including throughput, latency, durability, and availability. It is intended for Kafka administrators and developers planning to deploy Kafka in production.
This white paper provides a brief overview of how microservices can be built in the Apache Kafka ecosystem.
To succeed, insurance companies must unify data from all their channels that may be scattered across multiple legacy systems as well as new digital applications. Without the ability to access and combine all this data in real time, delivering a truly modern insurance experience while assessing fast-changing risks can be an uphill battle. Our eBook explains how event streaming, an emerging technology for analyzing event data in real time, can help insurers compete with their insurtech peers. You will learn how combining event streaming from Apache Kafka® and Confluent with Google Cloud can help you.
Every one of your customer touch points, from an actual purchase to a marketing engagement, creates data streams and opportunities to trigger automations in real time.
In this ebook, you’ll get a look at five of the common use cases when getting started with data streaming, with real-world customer examples and insights into how your organization can make the leap.
In this ebook, you’ll learn about the profound strategic potential in an event streaming platform for enterprise businesses of many kinds. The types of business challenges event streaming is capable of addressing include driving better customer experience, reducing costs, mitigating risk, and providing a single source of truth across the business. It can be a game changer.
This document provides an overview of Confluent and Snowflake’s integration, a detailed tutorial for getting started with the integration, and unique considerations to keep in mind when working with these two technologies.
Learn how CDC (Change Data Capture) captures database transactions for ingest into Confluent Platform to enable real-time data pipelines.
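CDC tools generally surface each database transaction as a change event carrying an operation code plus the row's before and after images, which downstream consumers replay to keep a copy in sync. A minimal sketch in Python of replaying such events against an in-memory replica — the event envelope here is a simplified, hypothetical shape loosely modeled on common CDC formats, not Confluent's wire format:

```python
# Apply CDC-style change events to an in-memory replica of a table.
# The (op, before, after) envelope is a simplified, hypothetical shape
# loosely modeled on common CDC formats such as Debezium's.

def apply_change_event(replica: dict, event: dict) -> None:
    """Replay one change event onto a {primary_key: row} replica."""
    op = event["op"]
    if op in ("c", "u"):                 # create or update: upsert the after-image
        row = event["after"]
        replica[row["id"]] = row
    elif op == "d":                      # delete: remove by the before-image key
        del replica[event["before"]["id"]]

replica = {}
events = [
    {"op": "c", "before": None, "after": {"id": 1, "name": "Ada"}},
    {"op": "u", "before": {"id": 1, "name": "Ada"}, "after": {"id": 1, "name": "Ada L."}},
    {"op": "d", "before": {"id": 1, "name": "Ada L."}, "after": None},
]
for e in events:
    apply_change_event(replica, e)

print(replica)  # create → update → delete leaves the replica empty again
```

Replaying the full event sequence in order is what lets a streaming pipeline rebuild the source table's state at any point, rather than working from periodic snapshots.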
Dive into full Kafka examples, with connector configurations and Kafka Streams code, that demonstrate different data formats and SerDes combinations for building event streaming pipelines.
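A SerDes is simply a matched serializer/deserializer pair: the producer side turns a record into bytes on the way into a topic, and the consumer side turns those bytes back into a record on the way out. A minimal JSON SerDes sketch in Python — the helper names are hypothetical, not Confluent's client API, where serializers are supplied as pluggable configuration:

```python
import json

# A minimal JSON SerDes: the matched serializer/deserializer pair a
# producer and consumer would agree on for a topic. Helper names are
# hypothetical; real Kafka clients take these as pluggable classes.

def json_serialize(record: dict) -> bytes:
    """Producer side: record -> bytes written to the topic."""
    return json.dumps(record, sort_keys=True).encode("utf-8")

def json_deserialize(payload: bytes) -> dict:
    """Consumer side: bytes read from the topic -> record."""
    return json.loads(payload.decode("utf-8"))

order = {"order_id": 42, "sku": "SHOE-9", "qty": 2}
wire_bytes = json_serialize(order)            # what lands in the topic
assert json_deserialize(wire_bytes) == order  # round-trips losslessly
```

The key property of any SerDes combination is shown by the final assertion: serialize and deserialize must round-trip, and both sides of the pipeline must use the same pair.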
Learn about the components of Confluent Enterprise, key considerations for production deployments, and guidelines for selecting hardware or deployment with different cloud providers.
Learn why organizations are considering Apache Kafka to streamline cloud migrations.
In this book, O’Reilly author Martin Kleppmann shows you how stream processing can make your data processing systems more flexible and less complex.
The reference architecture provides a detailed architecture for deploying Confluent Platform on Kubernetes and uses the Helm Charts for Confluent Platform as a reference to illustrate configuration and deployment practices.
In this white paper, we offer recommendations and best practices for designing data architectures that will work well with Confluent Cloud.
Learn about the challenges of traditional messaging middleware: hindered innovation, low fault tolerance at scale, ephemeral persistence that limits data usage for analytics, and soaring technical debt and operational costs.
Shoe retail titan NewLimits is drowning in stale, inconsistent data due to nightly batch jobs that keep failing. Read the comic to see how developer Ada and architect Jax navigate through Batchland with Iris, their guide, and enter Streamscape and the realm of event-driven architectures.
This ebook will show you how to make your highly valuable data available at scale, everywhere it needs to be, while keeping it secure and compliant.
A complete guide to migrating from open-source (OSS) Apache Kafka to Confluent, including best practices and customer success stories from real migration journeys.
In this white paper, you will learn how to monitor your Apache Kafka deployments like a pro, the 7 common questions you'll need to answer, the requirements to look for in a monitoring solution, and the key advantages of Confluent Control Center.
Learn about 6 common Kafka challenges that cause enterprise projects to fail, and how to overcome the disadvantages of running, managing, securing, and scaling Kafka.
This whitepaper is an in-depth guide to building streaming pipelines to data warehouses. Covers source and sink connectors (with Change Data Capture capabilities), stream processing with Kafka Streams and ksqlDB, with use cases and operational considerations.
This whitepaper is an in-depth guide to building streaming pipelines between different databases (RDBMS). Covers source and sink connectors (with Change Data Capture capabilities), stream processing with Kafka Streams and ksqlDB, with use cases and operational considerations.
Processing large amounts of data is challenging due to the cost, physical size, efficiency, and availability limitations most companies face. A scalable, highly available back-end system such as Confluent can efficiently process your company’s ever-growing volume of data.
This white paper unpacks the true costs of open source Kafka and MSK and demonstrates the value you can realize using Confluent.
Forrester says Confluent is a “Streaming force to be reckoned with,” and has named Confluent a leader in The Forrester Wave™: Cloud Data Pipelines, Q4 2023. See why Confluent is a leader.
Forrester says Confluent is a “Streaming force to be reckoned with,” and has named Confluent a leader in The Forrester Wave™: Streaming Data Platforms, Q4 2023. See why Confluent is a leader.
Learn how Confluent's fully managed, cloud-native Kafka powers enterprise-grade data streaming, integration, and governance for modern banking and financial services use cases.
We’ve put together a decision tree that will help you evaluate your current data streaming setup and trajectory to assess whether a fully managed data streaming platform is a good fit for you.
In this IDC Tech Brief, we share our research on streaming data platforms, and the advantages they’re bringing for innovation, improved operational efficiency, ROI, and more.
The modern world is defined by speed. Grocery delivery, rideshare apps, and payments for just about anything can happen instantly using a mobile device and its apps. Every action of every consumer creates data, and businesses must make sense of it quickly to take advantage in real time.
This Ventana Research Analyst Perspective explains why organizations have to manage and govern data streaming projects alongside data at rest.
Download the “Transform Your Data Pipelines, Transform Your Business: 3 Ways to Get Started” ebook to take a deep dive into the challenges associated with legacy data pipelines and how streaming pipelines can help you reinvent the way data flows through, and is accessed across, your organization.
Learn how event-driven architecture and stream processing tools such as Apache Kafka can help you build business-critical systems that open modern, innovative use cases.
Confluent named a Leader in the IDC MarketScape for Worldwide Analytic Stream Processing Software 2024. See why Confluent is a Leader.
Confluent named a Leader in the IDC MarketScape for Worldwide Event Brokering Software 2024. See why Confluent is a Leader.
Differentiating cloud-native, cloud, and cloud services, and lessons learned building a fully managed, elastic, cloud-native Apache Kafka.
Discover the latest Apache Flink developments and major Confluent announcements from Kafka Summit 2023 in 451 Research’s Market Insight Report.
This whitepaper covers how to implement a real-time fraud detection solution, covering multi-channel detection and real-time data integration, real-time processing, machine learning and AI, and real-time monitoring, reporting, and analytics.
In our ebook “Putting Fraud In Context”, we explore the complexities of fraud detection, why current detection tools often fall short and how Confluent can help.
Learn how Confluent can simplify and accelerate your migration to Amazon Aurora.
Top streaming data use cases powering leading financial services organizations, like Citigroup and 10x Banking, with real-time payments, fraud detection, and better customer experiences.
Recognizing the need for real-time data while understanding the burden of self-managing Kafka on their own led BigCommerce to choose Confluent—allowing them to tap into data streaming without having to manage and maintain the data infrastructure.
Our latest eBook explores legacy data pipeline challenges—and how streaming pipelines and existing tech partners can help you optimize how data flows through your company, and make it more accessible throughout your organization.
Mainframes play a fundamental role in many organizations, but can be expensive to operate. Discover how Confluent's data streaming technology can help reduce MIPS and lower costs, with real-world case studies and example architectures.
A data mesh is useful for military space operations for numerous reasons including improving data quality, enabling data access and sharing while maintaining security and access controls, and supporting better decision-making.
Confluent is uniquely positioned to help agencies reframe how they approach the responsibility for and the coordination of cyber defense and resilience.
With Confluent, USDA can deploy across on-prem and cloud environments so the different Mission Areas can continue to manage their data as they need. It creates a flexible and future-ready data infrastructure by decoupling producers and consumers, simplifying how data can be combined in new ways.
Confluent Platform completes the event streaming platform and adds the flexibility, durability, and security required for complex, large-scale mission operations.
A data mesh architecture helps address all eight guiding principles in the DoD Data Strategy from viewing data as a strategic asset to collective stewardship and enterprise access to being designed for compliance.
Data Centralization enables algorithms to work more effectively, with access to more information and working at the speed of machines to provide deeper insight in near real time.
With ABAC, authorization occurs at a level more granular than the topic in the stream: the requirement restricts access to fields within each event based on attribute types, combinations, and user roles.
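In practice, field-level ABAC means the authorization check runs per field rather than per topic: each field carries an attribute label, and a policy decides which labels a given role may read. A minimal sketch in Python — the attribute labels, role names, and policy tables are hypothetical, for illustration only:

```python
# Field-level ABAC sketch: redact event fields whose attribute labels
# the caller's role is not cleared for. Attribute and role names are
# hypothetical, for illustration only.

FIELD_ATTRIBUTES = {          # attribute label attached to each field
    "mission_id": "unclassified",
    "coordinates": "secret",
    "analyst_note": "secret",
}
ROLE_CLEARANCE = {            # attribute labels each role may read
    "operator": {"unclassified", "secret"},
    "auditor": {"unclassified"},
}

def filter_event(event: dict, role: str) -> dict:
    """Return only the fields of the event the role is authorized to see."""
    allowed = ROLE_CLEARANCE[role]
    return {k: v for k, v in event.items() if FIELD_ATTRIBUTES[k] in allowed}

event = {"mission_id": "M-17", "coordinates": "38.9N 77.0W", "analyst_note": "resupply"}
print(filter_event(event, "auditor"))   # only the unclassified field survives
```

Because the filter runs per field, two subscribers to the same stream can legitimately see different projections of the same event, which is the property that makes ABAC finer-grained than topic-level access control.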
Data streaming can be applied to nearly any citizen service (e.g., permit applications, financial aid, pensions, medical claims, immigration processing, tax filing), becoming increasingly powerful when government agencies use the same data sources across multiple applications.
Data mesh architectures help bridge the gap between the systems we have and the decisions we need to support.
How Confluent helps meet the Executive Order requirement for event forwarding and event log management in collecting, aggregating, routing, and sharing data.
Insights on streaming data from the General Services Administration (GSA), NASA, Air Force, and the Federal Energy Regulatory Commission.
The solution is better data-in-motion architectures that focus on harnessing the flow of data across applications, databases, Software as a Service (SaaS) layers, and cloud systems.
Confluent enables government organizations to easily inject legacy data sources into new, modern applications and adapt to changing real-world circumstances faster than ever.
As the DoD continues to invest in DevSecOps as a culture and approach to rapidly meeting the warfighter’s needs, it needs secure yet widely available access to cloud-native infrastructure.
Event streaming puts data in motion and creates a central nervous system for your entire organization, creating a new paradigm that supports collecting a continuous flow of data throughout an organization and processing it in real time.
Data streaming enables organizations to put data sharing in motion. The sharing organization publishes a stream of events (including changes and deltas) as they occur and data sharing consumers can subscribe to efficiently receive them as they happen.
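The pattern described here — a sharing organization publishing events as they occur, with consumers subscribing to receive only what they have not yet seen — can be sketched as a minimal in-memory append-only log in Python. The class below is hypothetical and for illustration only; a real deployment would use a Kafka topic with consumer offsets:

```python
# Minimal in-memory publish/subscribe log illustrating streamed data
# sharing: the publisher appends events as they occur; each subscriber
# tracks its own offset and pulls only the events it has not yet seen.
# Hypothetical class; a real system would use a Kafka topic.

class EventLog:
    def __init__(self):
        self._events = []
        self._offsets = {}          # subscriber name -> next unread position

    def publish(self, event):
        self._events.append(event)

    def poll(self, subscriber: str):
        """Return every event this subscriber has not consumed yet."""
        start = self._offsets.get(subscriber, 0)
        new = self._events[start:]
        self._offsets[subscriber] = len(self._events)
        return new

log = EventLog()
log.publish({"account": "A1", "delta": +100})
log.publish({"account": "A1", "delta": -30})
print(log.poll("analytics"))   # both events on the first poll
print(log.poll("analytics"))   # empty: nothing new since the last poll
```

Per-subscriber offsets are what make this efficient: each consumer receives changes and deltas incrementally as they happen, instead of re-requesting full snapshots.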
Confluent aligns closely with the goals of the Data Strategy’s principle of Conscious Design, harnessing existing data and protecting its quality and relevance, allowing agencies to be more responsive to constituent needs with modern services.
Confluent's data streaming platform enables government entities to transform the way they work with data to protect the public, improve infrastructure, manage transportation, and more.
Kafka management becomes risky and costly as it scales. Learn why Confluent reinvented Kafka as a cloud service delivering over 10X more elasticity, storage, and resiliency.
Download the “Kafka In the Cloud: Why It’s 10x Better With Confluent” ebook to take a deep dive into how Confluent harnessed the power of the cloud to build a data streaming platform that’s 10x better than Apache Kafka, so you can leave your Kafka management woes behind.
Confluent enables government agencies to utilize data as a continually updating stream of events, rather than discrete snapshots. Run your agency by building real-time applications with historical context - all based on a universal event pipeline.
In this report, you’ll learn how to use event streaming to process, store, analyze and act on both historical and real-time data in one place. You'll also explore: Data access and management challenges agencies are facing and how to address them.
The Confluent event-streaming platform enables government organizations to unlock and repurpose their existing data for countless modern applications and use cases.
This whitepaper describes some of the financial businesses that rely on Confluent and the game-changing business outcomes that can be realized by using data streaming technology.
As the DoD presses forward with Joint All-Domain Command and Control (JADC2) programs and architectures the Air Force is working to stand up technology centers that will not only allow for the sharing of data but for the sharing of data in motion.
Data streaming provides an accurate, real-time view of your business. Learn about the data streaming ecosystem, its benefits, and how to accelerate real-time insights and analytics in this guide.
Building a cloud-native data streaming platform isn’t just hosting Kafka on the cloud. We documented our design of Kora, the Apache Kafka engine built for the cloud, and were awarded “Best Industry Paper” at Very Large Data Bases (VLDB), one of the most prestigious tech conferences.
Why an event-driven data mesh built on Apache Kafka provides the best way to access important business data and unify the operational and analytical planes.
This whitepaper outlines the most common patterns and considerations for Mainframe Integration projects.
Many businesses are using streaming data in some form—but not necessarily effectively. As the volume and variety of data streams increases, data and analytics leaders should evaluate the design patterns, architectures, and vendors involved in data streaming technology to find relevant opportunities.
Taking Kafka to the cloud? Learn 3 best practices for building a cloud-native system that makes data streaming scalable, reliable, and cost-effective.
Learn about 5 challenges of legacy systems and why your organization should move its data infrastructure and Apache Kafka use cases to the cloud.
Modern customers crave personalization. How do banks deliver on it? By leveraging real-time data—enabled by data streaming platforms—to unlock powerful customer experiences.
Should you spend time self-managing open source technologies such as Apache Kafka® (build), or invest in a managed service (buy)? Let’s evaluate!
Modern fraud technology calls for a modern fraud detection approach, and that requires real-time data. Industry leaders from Capital One, RBC, and more are detecting fraud using data streaming to protect customers in real time.
To succeed, retailers must unify data scattered across point-of-sale, e-commerce, ERP, and other systems. Without integrating all of this data in motion—and making it available to applications in real time—it’s almost impossible to deliver a fully connected omnichannel customer experience.
Confluent can help you build data streaming pipelines that allow you to connect, process, and govern any data stream for any data warehouse.
Ventana Research finds that more than nine in ten organizations place a high priority on speeding the flow of data across their business and improving the responsiveness of their organizations. This is where Confluent comes in.
Read GigaOm’s ease-of-use study on self-managed Apache Kafka® and fully managed Confluent Cloud. See how Confluent accelerates and streamlines development.
Confluent is 10X better than Apache Kafka so you can cost-effectively build real-time applications on Google Cloud.
Confluent is 10X better than Apache Kafka so you can cost-effectively build real-time applications on Microsoft Azure.
Explore new ways that your organization can thrive with a data-in-motion approach by downloading the new e-book, Harness Data in Motion Within a Hybrid and Multicloud Architecture.
An overview of Confluent’s Core Product Pillars.
How Sainsbury’s is revolutionizing its supply chain with real-time data streaming from Confluent.
Optimize your SIEM to Build Tomorrow’s Cyber Defense with Confluent
To learn more about the E2E Encryption Accelerator and how it may be used to address your data protection requirements, download the Confluent E2E Encryption Accelerator white paper.
In 2022, if you want to deliver high-value projects that drive competitive advantage or business differentiation quickly, your best people can’t be stuck in the day-to-day management of Kafka, and your budget is better spent on your core business. By now you know, the answer is cloud.
Introduction to serverless, how it works, and the benefits stateful serverless architectures provide when paired with data streaming technologies.
To learn more about how you can implement a real-time data platform that connects all parts of your global business, download this free Confluent hybrid and multicloud reference architecture.
Check out IDC’s findings on why & how building resiliency matters in the face of near-constant disruption. To build resiliency, businesses should focus on one key area: their data.
Find out more in IDC’s From Data at Rest to Data in Motion: A Shift to Continuous Delivery of Value.
This eBook will explain how you can modernize your data architecture with a real-time, global data plane that eliminates the need for point-to-point connections and makes your data architecture simpler, faster, more resilient, and more cost effective.
This IDC Market Note discusses the main takeaways from the 2022 Kafka Summit in London, hosted by Confluent.
The secret to modernizing monoliths and scaling microservices across your organization? An event-driven architecture.
The companies most successful in meeting the demanding expectations of today’s customers are running on top of a constant supply of real-time event streams and continuous real-time processing. If you aspire to join the ranks of those capitalizing on data in motion, this is the place to start.
Download this white paper to read how Confluent can power the infrastructure necessary to run Autonomous Networks.
In this paper, we explore some of the fundamental concepts of Apache Kafka, the foundation of Confluent Platform, and compare it to traditional message-oriented middleware.
This ENTERPRISE MANAGEMENT ASSOCIATES® (EMA™) eBook will show how, with fully managed cloud-based event streaming, executives, managers, and individual contributors gain access to real-time intelligence and the enterprise will achieve unprecedented momentum and material gain.
In this eBook from Confluent and AWS, discover when and how to deploy Apache Kafka on your enterprise to harness your data, respond in real-time, and make faster, more informed decisions.
From data collection at scale to data processing in the Cloud or at the Edge—IoT architectures and data can provide enormous advantages through useful business and operational insights.
Confluent is pioneering a new category of data infrastructure focused on data in motion, designed to be the intelligent connective tissue enabling real-time data, from multiple sources, to constantly and securely stream across any organization.
Confluent’s platform for data in motion unifies silos and sets data in motion across an organization. Learn how this empowers developers to build the kinds of real-time applications that make their organizations more competitive and more efficient.
Discover how to fuel Kafka-enabled analytics use cases—including real-time customer predictions, supply chain optimization, and operational reporting—with a real-time flow of data.
Confluent Platform completes Kafka with a set of enterprise-grade features and services. Confluent Platform can reduce your Kafka TCO by up to 40% and accelerate your time to value for new data in motion use cases by 6+ months. Learn how Confluent Platform drives these outcomes for our customers.
For financial services companies, digital technologies can solve business problems, drastically improve traditional processes, modernize middleware and front-end infrastructure, improve operational efficiency, and most importantly, better serve customers.
Banks and financial institutions are looking toward a future in which most business is transacted digitally. They’re adding new, always-on digital services, using artificial intelligence (AI) to power a new class of real-time applications, and automating back-office processes.
Banking customers today demand personalized service and expect real-time insight into their accounts from any device—and not just during “business hours.” Financial institutions trying to meet those expectations have intense competition from each other as well as fintech startups...
Download this whitepaper to learn about ksqlDB, one of the most critical components of Confluent, that enables you to build complete stream processing applications with just a few simple SQL queries.
In this white paper, you’ll learn about five Kafka elements that deserve closer attention, either because they significantly improve upon the behavior of their predecessors, because they are easy to overlook or to make assumptions about, or simply because they are extremely useful.
This paper presents Apache Kafka’s core design for stream processing, which relies on its persistent log architecture as the storage and inter-processor communication layers to achieve correctness guarantees.
This white paper explores the potential benefits and relevance of deploying Confluent with the Istio service mesh.
Learn Kubernetes terms, concepts and considerations, as well as best practices for deploying Apache Kafka on Kubernetes.
This white paper reports the results of benchmarks we ran on a 2-CKU multi-zone dedicated cluster and shows the ability of a CKU to deliver the stated client bandwidth on AWS, GCP, and Azure clouds.
If you’re a leader in a business that could or does benefit from automation, IoT, and real-time data, don’t miss this white paper. The lifeblood of Industry 4.0 is streaming data, which is where event streaming comes in: the real-time capture, processing, and management of all your data in order to drive transformative technology initiatives.
This brief describes how to enable operational data flows with NoSQL and Kafka, in partnership with Couchbase and Confluent.
This paper provides 10 principles for streaming services—a list of items to be mindful of when designing and building a microservices system.
The IDC Perspective on Confluent Platform 6.0 is here, and in it, you can read IDC’s lens on the importance of event streaming to enterprise companies today.
We used to talk about the world’s collective data in terms of terabytes. Now, according to IDC’s latest Global Datasphere, we talk in terms of zettabytes: 138 zettabytes of new data will be created in 2024, and 24% of it will be real-time data. How important is real-time streaming data to enterprise organizations? If they want to respond at the speed of business, it’s crucial. In this digital economy, having a competitive advantage requires using data to support quicker decision-making, streamlined operations, and optimized customer experiences. Those things all come from data.
This white paper outlines the integration of Confluent Enterprise with the Microsoft Azure Cloud Platform.
This brief describes a modern data architecture with Kafka and MongoDB
The survey of the Apache Kafka community shows how and why companies are adopting streaming platforms to build event-driven architectures.
This brief describes a comprehensive streaming analytics platform for visualizing real-time data with Altair Panopticon and Confluent Platform.
This paper guides developers who want to build an integration or connector and outlines the criteria Confluent uses to verify the integration.
Read this white paper to learn about the common use cases Confluent is seeing amongst its financial services customers.
Ensure that only authorized clients have appropriate access to system resources by using RBAC with Kafka Connect.
Confluent Cloud is the industry's only cloud-native, fully managed event streaming platform powered by Apache Kafka.
Best practices for developing a connector using Kafka Connect APIs.
This brief describes a solution for data integration and replication in real time and continuously into Kafka, in partnership with HVR and Confluent.
This brief describes a solution to efficiently prepare data streams for Kafka and Confluent with Qlik Data Integration for CDC Streaming.
This reference architecture documents the MongoDB and Confluent integration including detailed tutorials for getting started with the integration, guidelines for deployment, and unique considerations to keep in mind when working with these two technologies.
In this three-day hands-on course, you will learn how to build, manage, and monitor clusters using industry best-practices developed by the world’s foremost Apache Kafka experts.
Most insurance companies today are somewhere along the spectrum of digital transformation, finding new ways to use data while staying within the confines of strict regulatory complexity and capital requirements. But only a few insurtech leaders and innovative startups have really tapped into real-time streaming data as the architecture behind these efforts. In this free ebook, learn about three pivotal insurance business uses for event streaming: reducing operating costs with automated digital experiences, personalizing the customer experience, and mitigating risks with real-time fraud and security analytics.
This survey focuses on why and how companies are using Apache Kafka and streaming data and the impact it has on their business.
Get key research stats on why CIOs are turning to streaming data for a competitive advantage.
This brief describes a modern datacenter that manages the velocity and variety of data with an event-driven enterprise architecture, in partnership with DataStax and Confluent.
In this ebook, you’ll learn about the adoption curve of event streaming and how to gain momentum and effect change within your organization. Learn how to wield event streaming to convert your enterprise to a real-time digital business, responsive to customers and able to create business outcomes in ways never before possible.
In this paper, we introduce the Dual Streaming Model. The model presents the result of an operator as a stream of successive updates, which induces a duality of results and streams.
In this three-day hands-on course, you will learn how to build an application that can publish data to, and subscribe to data from, an Apache Kafka cluster.
This brief describes a solution with Neo4js graph database and Confluent Platform.
This brief describes a solution for real-time data streaming with ScyllaDB's NoSQL database paired with Confluent Platform.
Confluent implements layered security controls designed to protect and secure Confluent Cloud customer data, incorporating multiple logical and physical security controls that include access management, least privilege, strong authentication, logging and monitoring, vulnerability management, and bug bounty programs.
This brief describes streaming data analysis and visualization accelerated by Kinetica's GPU in-memory technology, in partnership with Confluent.
Use cases for streaming platforms vary widely, with improving the customer experience among the most common. We have synthesized some common themes of streaming maturity and identified five stages of adoption.
This brief describes an end-to-end streaming analytics solution with Imply’s Druid providing data querying and visualization, and Kafka providing data streaming.
Spending time with many OEMs and suppliers as well as technology vendors in the IoT segment, Kai Waehner gives an overview on current challenges in the automotive industry and on a variety of use cases for event-driven architectures.
Driving Efficiency & Innovation Across Retail, Telco, Manufacturing, and More
Accenture & Confluent present: Open Standards for Data Lineage. Explore Data Lineage's Future!
Join us for an exclusive look at how BMW Group transformed their omnichannel experience by expanding into direct-to-consumer operations with a data streaming platform.
Learn how a real-time data streaming platform brought Thrivent’s data closer to customers for a seamless digital experience, no matter what channel they use.
Embark on a journey to Kafka success! Join our exclusive webinar, "Top 6 Reasons Kafka Projects Fail and How to Overcome Them," now on-demand.
Join us for an exclusive event focused on leveraging Flink for data streaming in financial services using Confluent’s fully managed Flink capabilities. Discover how Confluent’s data streaming platform and Flink are revolutionizing the financial industry by enabling real-time business insights, systems like fraud detection, and even how you could integrate Flink and Confluent in an LLM workflow.
Learn how to build a data mesh on Confluent Cloud by understanding, accessing, and enriching your real-time Kafka data streams using Stream Governance. Confluent’s product team will demo the latest features and enhanced capabilities, along with showing how you can get access in a few clicks.
Join this webinar to explore Kafka's core concepts, learn how to seamlessly integrate it into your environment, and understand how it stands apart from the Confluent Platform.
In this GenAI tutorial webinar by Confluent and MongoDB, you’ll learn how to build retrieval-augmented generation (RAG) in 4 key steps: data augmentation, inference, workflows, and post-processing. See a step-by-step walkthrough of vector embedding and get all your questions answered in a live Q&A.
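The retrieval step behind RAG can be sketched with a toy in-memory index: embed the query, rank stored documents by cosine similarity, and prepend the winners to the LLM prompt. The character-count "embedding" below is a deliberately crude stand-in for a real embedding model, and the list of documents stands in for a vector store such as MongoDB Atlas Vector Search.

```python
# A toy sketch of RAG retrieval. The hashing "embedding" and in-memory index
# are illustrative stand-ins for a real embedding model and vector store.
import math

def embed(text):
    """Hypothetical embedding: a bag-of-characters vector (real systems use a model)."""
    vec = [0.0] * 26
    for ch in text.lower():
        if "a" <= ch <= "z":
            vec[ord(ch) - ord("a")] += 1.0
    return vec

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

def retrieve(query, documents, k=1):
    """Return the k documents most similar to the query; their text is then
    prepended to the LLM prompt (the augmentation step)."""
    q = embed(query)
    ranked = sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

docs = ["kafka stores event streams", "flink processes streams", "cats sleep all day"]
print(retrieve("stream processing with flink", docs))
```

The inference, workflow, and post-processing steps then operate on the retrieved context; only the retrieval mechanics are sketched here.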
This Shift Left demo showcases how you can harness the power of a Data Streaming Platform (DSP) to clean and govern data at the time it is created, and deliver fresh trustworthy data to your data warehouse and data lake to maximize the ROI of your Snowflake or Databricks environments.
Join this webinar to see how the legacy of traditional implementations still impacts microservice architectures today.
In this demo webinar, you’ll learn about building a real-time knowledge base for RAG architecture. We’ll show you how to configure a source connector to bring in data from various sources, Flink SQL for vector embedding, and sink connector to send data to vector stores like MongoDB Atlas Vector search.
Financial institutions use voice biometrics for security, but AI's growth, especially in voice cloning, challenges their effectiveness. With advancing voice cloning AI, testing the resilience of these systems is vital. Learn how GoodLabs Studio developed their Vishing Penetration Testing System with Confluent, Flink, Genesys, an LLM, and ElevenLabs integration.
Join us on August 15th for an exclusive webinar featuring Jay Kreps, Co-founder and CEO of Confluent. Discover the insights from our 2024 Data Streaming Report and learn why 94% of IT leaders in the APAC region believe data streaming platforms are important for achieving their data goals.
In this webinar, you’ll learn about the latest security tools and features in Confluent Platform designed to protect your streaming Kafka data across on-premises or hybrid environments, so you can launch faster while upholding strict security and compliance requirements.
In just five months, BigCommerce migrated 1.6 billion events a day to Confluent, and saved 20 hours a week in Kafka management, along with the scalability savings gained from no longer having to over-provision clusters to accommodate seasonal traffic spikes.
Demo Webinar: Explore Kora Engine, Enterprise Clusters, and more in our Q3 Confluent Cloud Launch.
This fireside chat will cover Suman’s learnings from implementing 2 critical use cases at Walmart that continue to play a critical role in customer satisfaction: real-time inventory and real-time replenishment.
Join this webinar, where we’ll take you on a tour of how we re-architected the inner workings of Apache Kafka® to build Kora Engine.
In this session you'll learn how to modernize your messaging workloads. This demo shows how to connect your on-premises messaging systems to Confluent Platform.
Developing a streaming solution against a self-managed Kafka cluster can be awkward and time-consuming, largely due to security requirements and configuration red tape. It's beneficial to use Confluent Cloud in the early stages to make quick progress. Creating the cluster in Confluent Cloud is super easy and allows you to concentrate on defining your Connect sources and sinks as well as fleshing out the streaming topology on your laptop. It also shows the client how easy it is to swap out the self-managed Kafka cluster for Confluent Cloud.
Providing a seamless digital experience is what our customers have come to expect. As we step into 2022, our businesses are being challenged to evolve even further to maintain that competitive edge.
In this online talk, you’ll hear about ingesting your Kafka streams into Imply’s scalable analytic engine and gaining real-time insights via a modern user interface.
Discover how Michelin harnessed Apache Kafka and Confluent Cloud to transform their supply chain, saving on operational costs and unlocking real-time insights.
Join Unity, Confluent and GCP to learn how to reduce risk and increase business options with a hybrid cloud strategy.
In this session, we will cover the easiest ways to start developing event-driven applications with Apache Kafka using Confluent Platform.
Guest speakers from Forrester, Siemens, and Amazon Web Services (AWS) talk about why data streaming platforms are essential to businesses everywhere and are the key to turning your data mess of batch-oriented, point-to-point connections into data value.
Join this webinar to learn how Capital One, one of the biggest U.S. banks, successfully manages real-time data streaming across the enterprise to deliver differential value to over 100 million customers.
You’ll walk away with an understanding of how to modernize your SIEM architecture for higher throughput, lower latency, and more cost efficiency. You’ll also be able to run the demo and explore a series of hands-on labs for yourself and dig into the technical details.
Maygol will walk us through the new streaming data pipeline demo: a finserv use case with streaming data pipelines from on-prem Oracle database and RabbitMQ systems, migrating data to MongoDB in the cloud.
In an era of rapid digital transformation, harnessing data effectively is paramount. Discover how Kafka's real-time capabilities revolutionize data management, empowering Energy & Utilities companies to optimize operations, enhance grid reliability, and drive sustainable innovation.
Unlock invaluable insights from Uniper, a trailblazer in the industry, as they share firsthand experiences and best practices. Whether you're a seasoned professional or new to the field, this webinar promises illuminating discussions and actionable strategies to propel your organization forward.
Learn how to build a real-time, contextualized, and trustworthy knowledge base for your GenAI applications. Leverage data streaming and Apache Flink® stream processing for LLM-RAG architectures. See a RAG demo and get your questions answered.
This panel of industry experts discuss their everyday usage of data streaming within their company, how they got there, and which use cases they will be focusing on further down the road to real-time.
Join our webinar to learn best practices for constructing a data mesh. Explore optimizing data pipelines for this architecture and the transformative role of streaming pipelines in creating a democratized data marketplace.
In this 30-minute session, hear from top Kafka experts who will show you how to easily create your own Kafka cluster and use out-of-the-box components like ksqlDB to rapidly develop event streaming applications.
In this two-hour spooktacular workshop with Bruce Springstreams, learn about event-driven microservices with Spring BOOOOt and Confluent Cloud.
Replace the mainframe with new applications using modern and less costly technologies. Stand up to the dinosaur, but keep in mind that legacy migration is a journey. This session will guide you to the next step of your company’s evolution!
A company's journey to the cloud often starts with the discovery of a new use case or need for a new application. Deploying Confluent Cloud, a fully managed cloud-native streaming service based on Apache Kafka, enables organisations to revolutionise the way they build streaming applications and real-time data pipelines.
Adjusting to the real-time needs of your mission-critical apps is only possible with an architecture that scales elastically. Confluent re-engineered Apache Kafka into an elastically scalable, next-gen event streaming platform that processes real-time data wherever it lives - making it accessible for any budget or use case.
Capital One supports interactions with real-time streaming transactional data using Apache Kafka®. Join us for this online talk on lessons learned, best practices and technical patterns of Capital One’s deployment of Apache Kafka.
In this online talk, we introduce Apache Kafka® and the MongoDB connector for Kafka, and demonstrate a real world stock trading use case that joins heterogeneous data sources to find the moving average of securities using Apache Kafka and MongoDB.
View this webinar with Confluent and Microsoft experts to:
Stream processing is a data processing technology used to collect, store, and manage continuous streams of data as it’s produced or received. Also known as event streaming or complex event processing (CEP), stream processing has grown exponentially in recent years due to its powerful...
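The core idea — processing events continuously as they are produced rather than in periodic batches — can be illustrated with a tumbling-window aggregation in plain Python. The click events and 10-second window size are made-up examples, not anything from a specific product.

```python
# A minimal sketch of stream processing: events are consumed as they arrive
# and aggregated into fixed, non-overlapping (tumbling) time windows.
from collections import defaultdict

def tumbling_window_counts(events, window_size=10):
    """Count events per key within tumbling windows.
    Each event is a (timestamp_seconds, key) pair."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_size) * window_size  # align to window boundary
        counts[(window_start, key)] += 1
    return dict(counts)

clicks = [(1, "home"), (4, "cart"), (9, "home"), (12, "home"), (19, "cart")]
print(tumbling_window_counts(clicks))
# {(0, 'home'): 2, (0, 'cart'): 1, (10, 'home'): 1, (10, 'cart'): 1}
```

Real stream processors add the hard parts this sketch omits: out-of-order events, watermarks, and fault-tolerant state.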
In today’s fast-paced digital world, customers want businesses to anticipate their needs in real time. To meet these heightened expectations, organizations are using Apache Kafka®, a modern, real-time data streaming platform.
Join our webinar to explore how Confluent's data streaming platform, Azure data services, and Azure OpenAI deliver real-time insights and help customers conduct seamless business transactions.
As the year draws to a close, we invite you to join us for a special event that reflects on the best moments of 2023 and provides a glimpse into the future of data and innovation.
In this fireside chat you’ll hear from Dollar General’s Head of Merchandising and Supply Chain Engineering on why and how the organization has adopted data streaming, benefits seen so far, and tactical recommendations on how other organizations can adopt similar use cases.
In this webinar, we will walk you through two product demos to ensure you’re ready for ZooKeeper-less Kafka: one on how to get started with KRaft and the second on how to migrate to KRaft from an existing deployment.
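For a feel of what replaces the ZooKeeper connection settings, here is a sketch of a single-node, combined-mode KRaft broker configuration. The hostnames, ports, and log directory are placeholders, and this is illustrative rather than a production setup; consult the Kafka documentation for the full migration procedure.

```properties
# Minimal single-node KRaft (combined broker + controller) configuration sketch.
process.roles=broker,controller
node.id=1
controller.quorum.voters=1@localhost:9093
listeners=PLAINTEXT://localhost:9092,CONTROLLER://localhost:9093
controller.listener.names=CONTROLLER
log.dirs=/tmp/kraft-logs
```

The controller quorum replaces ZooKeeper as the store for cluster metadata, which is what the migration demos in the webinar walk through.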
In this demo webinar, you’ll learn about Confluent’s connector portfolio and how it can enable seamless, reliable, and secure integrations with your source and sink data systems. We’ll show you how to set up secure networking, configure popular connectors, and leverage other productivity features.
Check out our webinar with McAfee, where you’ll get first-hand insight into how the cybersecurity giant replatformed its architecture to reap the benefits of real-time data streaming with a fully managed cloud service.
Join us for our monthly webinar series, "Streaming Use Case Showcase," and discover how industry leaders, customers, and partners leverage Confluent's cutting-edge technology to revolutionise IT and achieve unprecedented business outcomes.
This is a three-part series which introduces key concepts, use cases, and best practices for finding success with event-driven microservices. Each session is recorded, so if you miss one you’ll have a chance to watch on demand.
Get ready for an exclusive opportunity to immerse yourself in the transformative world of Kafka Summit London! Whether you're a tech enthusiast, a business leader, or simply curious about the forefront of data streaming, this series is designed for you.
In this webinar, you’ll get a detailed overview of what’s new with our fully managed Flink service. You’ll also see a technical demo that incorporates all of the latest Flink enhancements on Confluent Cloud, including Actions, Stream Lineage integration, and more.
See how Extend pairs Confluent's data streaming platform with AWS serverless services to build scalable data applications.
During this webinar, learn how to simplify event-driven, serverless architectures with Confluent and AWS Lambda. You'll see how to scale seamlessly, integrate AWS Lambda with Confluent, and build apps faster with Confluent’s robust connector ecosystem.
Learn about the new key features in the Confluent Cloud Q1 2023 launch - Centralized Identity Management (OAuth), Enhanced RBAC, Client Quotas and more that enable you to build a secured shared services data streaming platform.
Listen to this webinar to learn how to build and deploy data pipelines faster while combining and enriching data in motion with Confluent & Azure Cosmos DB.
In this webinar, Dan Rosanova, Group Product Manager at Confluent, will cover:
In this webinar, we’ll walk through how to build streaming data pipelines to Databricks across on-prem and cloud environments using Confluent and our ecosystem of pre-built connectors, with ksqlDB for real-time stream processing.
Join us for this webinar where we will discuss the challenges and benefits of a cloud migration and how our partner, Improving, can help you simplify and navigate the process with confidence.
Watch this webinar for an opportunity to hear from the thought leaders of Kafka and Apache Druid on how Confluent Cloud and Imply Polaris enable customers to leverage the power of interactive streaming platforms to accelerate real time data analytics.
Learn how to ensure high data quality, discoverability, and compatibility for your real-time data streams on Confluent Cloud using Stream Governance. Confluent’s Stream Governance team will demo the latest features and enhanced capabilities, along with showing how you can upgrade in a few clicks.
Learn the benefits of data mesh, how to best scale your data architecture, empower real-time data governance, and best practices from experts at Confluent and Microsoft.
Learn in this demo-driven webinar best practices for building a resilient Kafka deployment with Confluent Platform to ensure continuous operations and automatic failover in case of an outage.
Join Kai Waehner, Global Field CTO at Confluent, to explore the latest financial services trends and learn how data streaming helps modernize legacy architectures to enable digitalization in regulated industries.
In this hands-on session you’ll learn about Custom Connectors for connecting to any data system or apps without needing to manage Kafka Connect infrastructure. We’ll show you how to upload your connector plugin, configure the connector, and monitor the logs and metrics pages to ensure high performance.
Your data streaming platform needs to be truly elastic to match customer demand. It should scale up with your business’s peak traffic, and back down as demand shrinks.
In this demo webinar, you will learn about our new Apache Flink on Confluent Cloud, the industry’s only cloud-native, serverless Flink service for stream processing alongside other new innovations on Confluent Cloud.
Explore the state of data streaming for the logistics sector, where digital logistics and real-time capabilities are a core segment for investments and data consistency is crucial!
Learn how companies will leverage event-driven data streaming, Apache Kafka, Confluent, and other tools to meet the demands of real-time markets, increased regulations, heightened customer expectations, and much more.
Register now to attend this informative online talk and discover Kai’s top five cutting-edge use cases and architectures that are at the forefront of real-time data streaming initiatives.
What is data mesh and why is it gaining rapid traction among data teams?
Join us on May 17 to talk with Michele Goetz, VP, Principal Analyst at Forrester and Raiffeisen Bank International for a deep dive.
Straight from the event floor to you, the DIMT Reflections - A Data in Motion Recap discussion series showcases the best of the best content from the Data in Motion tour, giving you access to conversations and content from the best the APAC region has to offer.
Learn how Confluent helps you manage Apache Kafka® — without its complexity.
Join Confluent for the opportunity to hear from customers, network with your peers and ecosystem partners, learn from Kafka experts, and roll up your sleeves with interactive demonstrations.
Data Streaming with Apache Kafka and Apache Flink is one of the world's most relevant and talked about paradigms in technology. With the buzz around this technology growing, join Kai Waehner, Global Field CTO at Confluent, to hear his predictions for the 'Top Five Trends for Data Streaming in 2024'.
Explore general trends like customer-driven in-store experiences, social & locality platforms, and how data streaming helps modernize and innovate customer experiences as well as operational efficiencies.
Explore the latest data streaming trends and architectures, including edge, datacenter, hybrid, and multicloud solutions.
Explore the state of data streaming in the insurance industry, which constantly needs innovation due to changing market environments and changes in customer expectations.
Explore the latest financial services trends and learn how data streaming helps modernize legacy architectures to enable digitalization in regulated industries.
Explore general trends and how data streaming architectures, including edge, data center, hybrid, and multicloud, help to modernize and innovate the industry.
Explore general trends like software-defined manufacturing and how data streaming helps modernize and innovate the entire engineering and sales lifecycle.
Learn how DISH Wireless scaled their data streaming platform to power a new smart 5G network and deliver next-gen apps and valuable network data products. Hear why DISH chose Confluent Cloud and delve deeper into their 5G architecture.
In this demo, we will show you how to connect on-premises and multi-cloud data to Azure Cosmos DB, process that data in a stream before it reaches Azure Cosmos DB, and connect your Azure Cosmos DB data to any application.
Join us to hear how Confluent enables our customers to use real-time data processing against Apache Kafka®, leverage easy-to-use yet powerful interactive interfaces for stream processing, and build integration pipelines without needing to write code.
In this workshop session, you will follow along with an instructor as you walk through the design, build, and implementation process with a simple, hypothetical application using Confluent Cloud.
Join us to learn how to set up Confluent Cloud to provide a singular and global data plane connecting all of your systems, applications, datastores, and environments – regardless of whether systems are running on-prem, in the cloud, or both.
McAfee, a leader in online protection, recognized the need to transition from open-source Kafka for their cloud-native modernization effort. Learn how they drove a successful migration, secured leadership buy-in to support this shift, & discovered insights for crafting an effective Kafka strategy.
In this webinar, hear directly from KOR's CTO to learn why they chose Confluent over MSK to serve as a single source of truth for four decades’ worth of trade reporting data to stay in compliance with financial regulations.
In this hands-on session with Q&A, you’ll learn how to build streaming pipelines to connect, process, govern, and share real-time data flows for cloud databases. The demo shows how an ecommerce company uses streaming pipelines for Customer 360 and personalization.
In a world where real-time analytics, cloud, event streaming, and Kafka are hot topics, how does “Data In Motion” come into play? What are the core ideas behind it, and why is it a big deal for companies going through digital transformation?
Join Forrester analyst Mike Gualtieri and Albertsons Senior Director of Omni-Channel Architecture Nitin Saksena to hear about the market trends driving the adoption of data streaming and how Albertsons has implemented a plethora of real-time use cases to deliver differentiated customer experiences.
In this webinar, we’ll show you how to leverage Confluent Cloud and Google Cloud Platform products such as BigQuery to streamline your data in minutes, setting your data in motion.
How do you mobilize your data securely and cost effectively to power a global business in real time?
In this webinar, you'll learn about the new open preview of Confluent Cloud for Apache Flink®, a serverless Flink service for processing data in flight. Discover how to filter, join, and enrich data streams with Flink for high-performance stream processing at any scale.
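The filter, join, and enrich operations that a Flink SQL query expresses declaratively can be sketched in plain Python over in-memory events. The order and customer records below are made-up examples, and this imperative loop only illustrates the semantics, not Flink's distributed execution.

```python
# A plain-Python sketch of filtering, joining, and enriching a stream of
# order events against a customer table (illustrative data, not a real API).
orders = [
    {"order_id": 1, "customer_id": "c1", "amount": 250},
    {"order_id": 2, "customer_id": "c2", "amount": 40},
    {"order_id": 3, "customer_id": "c1", "amount": 90},
]
customers = {"c1": {"name": "Ada", "tier": "gold"},
             "c2": {"name": "Jax", "tier": "silver"}}

def filter_join_enrich(order_stream, customer_table, min_amount=50):
    for order in order_stream:                           # consume each event as it arrives
        if order["amount"] < min_amount:                 # FILTER: WHERE amount >= min_amount
            continue
        customer = customer_table[order["customer_id"]]  # JOIN on customer_id
        yield {**order, **customer}                      # ENRICH with customer fields

print(list(filter_join_enrich(orders, customers)))
# Orders 1 and 3 pass the filter, each enriched with the customer's name and tier.
```

In Flink SQL the same pipeline would be a single `SELECT ... JOIN ... WHERE` statement running continuously over unbounded streams.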
Learn from Vimeo about creating better, faster real-time user experiences at massive scale. From batch ETL with a 1-day delay to building streaming data pipelines, learn how Vimeo unlocked real-time analytics and performance monitoring to optimize video experiences for 260M+ users.
In this hands-on session we’ll show how to enrich customer data with real-time product, order, and demographic data every time a new order is created. You’ll learn how to connect data sources, process data streams with ksqlDB, and govern streaming pipelines with Confluent.
Join the Confluent team for a webinar that delves into the most impactful data streaming use cases, leaving you clear on the value these can deliver to your business and how to get started with deploying a data streaming platform.
Kafka is a platform used to collect, store, and process streams of data at scale, with numerous use cases. Join us in this live, interactive session, to learn more about Apache Kafka.
Zookeeper removal, Confluent for Kubernetes, governance updates, cluster linking, and more! Join this demo webinar and see our product highlights from Confluent Platform 7.x.
This session covers common use cases for SMTs when sending data to and from a Kafka cluster, including masking sensitive information, storing lineage data, removing unnecessary columns, and more.
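The masking use case can be sketched in a few lines: an SMT rewrites each record as it passes through Kafka Connect. Connect's built-in transforms are configured declaratively rather than coded; this Python function merely imitates the effect of a field-masking transform on one record.

```python
# A sketch of what a field-masking Single Message Transform does to a record
# in flight (illustrative only; real SMTs are configured in Kafka Connect).
def mask_fields(record, fields, mask="****"):
    """Return a copy of the record with the named fields replaced by a mask."""
    return {k: (mask if k in fields else v) for k, v in record.items()}

record = {"user": "ada", "card_number": "4111111111111111", "amount": 42}
print(mask_fields(record, {"card_number"}))
# {'user': 'ada', 'card_number': '****', 'amount': 42}
```

Because the transform runs per record in the Connect pipeline, sensitive values never reach the downstream system in the clear.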
Maximize the value of your SIEM platform: Make data streaming the entry point for your cyber data and deploy next-generation SIEM pipelines
Application architecture is shifting from monolithic enterprise systems to flexible, scalable, event-driven approaches. Welcome to the era of microservices.
During this demo webinar, you’ll learn about the enterprise data streaming capabilities Confluent and SAP are building together. See how you can build real-time experiences with your SAP data at a lower cost with Confluent’s fully managed data streaming platform now integrated with SAP Datasphere.
In this hands-on session, you’ll learn how to integrate your IBM mainframe with Confluent in order to unlock Z System data for use in real-time, cloud-native applications.
Learn how Gen AI is transforming the way businesses approach their data strategy, offering real-time, context-driven content generation.
Leveraging Confluent’s fully managed, cloud-native service for Apache Kafka®, DriveCentric has been able to successfully transform and grow their business within a rapidly changing market.
Explore use cases, architectures, and success stories for data streaming in the aviation industry, including airlines, airports, global distribution systems (GDS), aircraft manufacturers, and more.
GenAI has the potential to revolutionize industries by improving efficiency, reducing costs, and providing better outcomes for individuals and businesses.
Demo webinar: See how real-world gaming tech use cases like real-time player cheat detection are powered by Confluent’s 10x Kafka service for data streaming and AWS Lambda.
Modernize your database and move to the cloud by connecting multicloud and hybrid data to Amazon Aurora in real time.
Learn the latest cost and time-saving data estate modernization best practices with Azure and Confluent.
Join Lyndon Hedderly and Burhan Nazir of Confluent as they share their expertise on deploying enterprise-wide data streaming and accelerating the speed to realising measurable business value derived from a data-streaming investment.
This event-driven microservices webinar offers refreshed messaging on why event-driven microservices are an important use case for Confluent. Maygol will walk through a brand-new demo focused exclusively on an event-driven microservices use case.
Join us in our new Confluent Illustrated webinar as we present the fundamental aspects of data mesh and how to best put it into practice.
Experience a groundbreaking discussion led by James Golan, a solutions engineering expert at Confluent. In this comprehensive session, delve into the core concepts of data motion, where data powers digital experiences and businesses alike.
This 35-minute webinar is an overview of the Gartner presentation “How Data Streaming Makes Your Broader Data Strategy Successful.”
It is a high-level overview of Greg’s presentation, with some added commentary on what is being discussed at the summit.
Join us for a captivating fireside chat with John Heaton, the visionary CTO of Alex Bank, as he reveals the transformative power of cutting-edge technology in banking.
Discover how Homepoint uses Confluent and Azure to Speed up Loan Processes
A 'how to' webinar in which Rojo outlines how to optimise the use of Apache Kafka in your SAP integration initiatives.
Pritha Mehra, CIO of United States Postal Service, spoke with Confluent co-founder Jun Rao, where she described how the postal service leveraged data streaming to send free COVID-19 test kits to all Americans at the height of the pandemic.
Video with Jason Schick: The strategy emphasizes the need for enterprise-wide data standards and coordination of data use across agencies, as well as using data to inform annual budget planning.
Hear our esteemed panel of experts address how to leverage information as a strategic asset.
Government agencies understand the need to augment traditional SIEM systems. And, with this knowledge comes the pressure to do so in a way that is better, faster, and cheaper than before.
Join Kai Waehner, Field CTO at Confluent, for an online talk in which he will explore the latest data in motion & Apache Kafka® use cases for the defence industry.
Data streaming is an infrastructure revolution that is fundamentally changing how public sector organisations think about data and build applications. Rather than viewing data as stored records or transient messages, data could be considered to be a continually updating stream of events.
Join this demo webinar to see how Confluent and Rockset power a critical architecture for efficiently developing and scaling AI applications built on real-time streaming data.
From batch to real time—learn about and see a demo on how to build streaming pipelines with CDC to stream and process data in real time, from an on-prem Oracle DB and cloud PostgreSQL to Snowflake.
Join us over 3 days from September 26th to 28th for the "Full Stream Ahead - Live from Current 2023" webinar series.
This webinar will walk through the story of a bank that uses an Oracle database to store sensitive customer information and RabbitMQ as the message broker for credit card transaction events.
Data streaming is an infrastructure revolution that is fundamentally changing how Departments and Agencies think about data and build applications.
In this webinar, learn how Confluent and AWS can help your company detect and combat financial fraud. Confluent's cloud-native data streaming platform gathers and analyzes transactional and event data in real time to prevent fraud, reduce losses, and protect your business from threats.
Demo webinar: Build a real-time analytics app to query and visualize critical observability metrics including latencies, error rates, and overall service health status. See how it’s done with Confluent Cloud and Imply Polaris, a fully managed Apache Druid® service.
Data governance is critical, but how do you govern data streams in real-time? Learn how ACERTUS drove faster microservices development, unlocked streaming data pipelines and real-time data integration across 4K+ schemas using Confluent’s Stream Governance.
Partner webinar: Meet with Confluent’s Kafka experts to build your step-by-step plan for integrating with Confluent Cloud and accelerating customer growth on your platform through real-time data streams.
Join us for a unique insider look into the complex world of fraud mitigation in banking. In this session, you will learn how Spain’s leading digital bank, Evo Banco, is paving the way in predictive fraud detection with data streaming and machine learning.
With fraud growing at exponential rates and costing financial firms billions of dollars in losses, we take a deeper look into the role timely data and real-time context plays in risk analytics and fraud mitigation.
Confluent Platform 7.4 enables Apache Kafka® to scale to millions of partitions. It simplifies architectures and accelerates time to market with self-service tooling and codified best practices for developers, ensuring consistent and accurate data.
Join this webinar to learn how Confluent Cloud relieves the operational burdens of running Kafka, with infinite storage that is 10x more scalable and performant.
In this demo, you will learn how to transform your existing fraud tools with the power of real time data streaming and processing in Confluent. See how to easily connect, curate and integrate relevant data into your fraud systems to build a faster, smarter fraud detection solution.
In this webinar you will hear from industry experts on how real-time data streaming is advancing fraud detection and driving smarter decision-making across the Financial Services industry.
This demo webinar will provide everything you need to get started with the latest updates to Confluent Cloud, our cloud-native data streaming platform.
In this hands-on session with Q&A, you’ll learn how to build streaming data pipelines to connect, process, and govern real-time data flows for cloud databases. The demo shows a FinServ company using streaming pipelines for real-time fraud detection.
In this webinar, we’ll walk through how you can start immediately migrating to Amazon Redshift across on-prem and cloud environments using Confluent, our ecosystem of pre-built connectors, and ksqlDB for real-time data processing.
During this webinar, Rishi Doerga, Senior Solutions Engineer at Confluent, discusses how event streaming can help modernize your applications, enabling you to become more agile, innovative, and responsive to your customer's needs.
In this hands-on session, you’ll learn how to integrate your IBM mainframe with Confluent in order to unlock Z System data for use in real-time, cloud-native applications.
Partners Tech Talks are webinars in which subject matter experts from a partner discuss a specific use case or project. The goal of Tech Talks is to share best practices and application insights, provide inspiration, and help you stay up to date on innovations in the Confluent ecosystem.
Watch this webinar and transform your data pipeline processes.
In this hands-on session with live Q&A, you’ll learn how to build streaming data pipelines to connect, process, and govern real-time data flows for data warehouses. The demo shows an e-commerce company using streaming pipelines for customer 360.
This webinar covers the operational use case and learnings from SecurityScorecard’s journey from batch to building streaming data pipelines with Confluent.
In this session, we'll explore how Confluent helps companies modernize their database strategy with Confluent Cloud and modern Azure Data Services like Cosmos DB. Confluent accelerates getting data to the cloud and reduces costs by implementing a central-pipeline architecture using Apache Kafka.
Migrating, innovating, or building in the cloud requires retailers to rethink their data infrastructure. Confluent and Azure enable companies to set data in motion across any system, at any scale, in near real-time.
Watch this webinar to find out how a data mesh can bring much-needed order to a system in both cases, resulting in a more mature, manageable, and evolvable data architecture.
Learn how ACERTUS leverages Confluent Cloud and ksqlDB for their streaming data pipelines, data pre-processing and transformations, data warehouse modernization, and their latest data mesh framework project.
In this hands-on session you’ll learn about building trusted shared services with Confluent—a better way to allow safe and secure data sharing. We’ll show you how to enable trusted shared services through OAuth 2.0, role-based access control, and Cloud Client Quotas.
Confluent Infinite Storage allows you to store data in your Kafka cluster indefinitely, opening up new use cases and simplifying your architecture. This hands-on workshop will show you how to achieve real-time and historical processing with a single data streaming platform.
Learn how Apache Kafka® on Confluent Cloud streams massive data volumes to time series collections via the MongoDB Connector for Apache Kafka®.
In this session, you’ll learn how to accelerate your digital transformation using real-time data.
Join Ryan James, Chief Data Officer of Vitality Group, to learn how Vitality Group future-proofed its event-driven microservices with Confluent and AWS.
Modernizing your data warehouse doesn’t need to be long or complicated. In this webinar, we’ll walk through how you can start migrating to Databricks immediately across on-prem and cloud environments using Confluent, our ecosystem of pre-built connectors, and ksqlDB for real-time data processing.
This demo will showcase how to use Confluent as a streaming data pipeline between operational databases. We’ll walk through an example of how to connect data and capture change data in real-time from a legacy database such as Oracle to a modern cloud-native database like MongoDB using Confluent.
In this hands-on session you’ll learn about Streaming Data Pipelines, a better way to build real-time pipelines. The demo shows how an e-commerce company can use a Streaming Data Pipeline for real-time data warehousing.
Listen to this webinar to learn how Confluent Cloud enables your developer community.
Listen back and view the presentations from the Data in Motion Tour 2022 - EMEA.
Data Streaming and Apache Kafka® are two of the world's most relevant and talked about technologies. With the buzz continuing to grow, join this webinar to hear predictions for the 'Top Five Use Cases & Architectures for Data In Motion in 2023'.
Every aspect of the financial services industry is undergoing some form of transformation. By leveraging the power of real-time data streaming, financial firms can drive personalized customer experiences, proactively mitigate cyber risk, and drive regulatory compliance.
Join this demo to see how Confluent’s Stream Governance suite delivers a self-service experience that helps all your teams put data streams to work.
Demand for fast results and rapid decision-making has driven financial institutions to adopt real-time event streaming and processing to stay on the competitive edge. Apache Kafka® and the Confluent Platform are designed to solve the problems associated with traditional systems.
Network Analytics in a Big Data World: Why telco networks and data mesh need to become one (and how data streaming can save the internet) with Swisscom, NTT, INSA Lyon, and Imply in a panel discussion with Field CTO Kai Waehner.
Developers can focus on building new features and applications, liberated from the operational burden of managing their own Kafka clusters. Join us in these interactive sessions to learn more about Confluent Cloud.
Join Noam Berman, Software Engineer at Wix, for an insight-packed webinar in which he discusses Wix's growing use of Apache Kafka® in recent years.
Kafka Streams transforms your streams of data, whether through a stateless transformation like masking personally identifiable information, a complex stateful operation like aggregating across time windows, or a lookup against a table.
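The two kinds of transformation described above can be sketched conceptually. This is a plain-Python illustration of the semantics, not the Java Kafka Streams API; the record shape and field names are illustrative assumptions.

```python
def mask_pii(record):
    """Stateless: redact personally identifiable fields, one record at a time."""
    masked = dict(record)
    masked["card_number"] = "****" + masked["card_number"][-4:]
    return masked

def aggregate_by_window(records, window_ms):
    """Stateful: sum amounts per (key, time window) -- requires a state store."""
    state = {}  # the analogue of a Kafka Streams state store
    for r in records:
        window_start = (r["ts"] // window_ms) * window_ms
        key = (r["user"], window_start)
        state[key] = state.get(key, 0) + r["amount"]
    return state

events = [
    {"user": "alice", "ts": 1_000, "amount": 10, "card_number": "4111111111111111"},
    {"user": "alice", "ts": 4_000, "amount": 5,  "card_number": "4111111111111111"},
    {"user": "bob",   "ts": 1_500, "amount": 7,  "card_number": "5500000000000004"},
]

masked = [mask_pii(e) for e in events]
totals = aggregate_by_window(events, window_ms=5_000)
print(masked[0]["card_number"])  # ****1111
print(totals[("alice", 0)])      # 15
```

The stateless map needs no memory between records, while the windowed aggregation must keep running state, which is why Kafka Streams backs such operations with fault-tolerant state stores.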
Confluent’s Stream Designer is a new visual canvas for rapidly building, testing, and deploying streaming data pipelines powered by Kafka.
Tune in to discover how you can avoid common mistakes that could set back your event-driven ambitions, and how Confluent’s fully managed platform can help get you where you need to be faster and with fewer operational headaches.
This fireside chat will cover Suman’s learnings from implementing two use cases at Walmart that continue to play a critical role in customer satisfaction: real-time inventory and real-time replenishment.
Tune in to hear Suman share best practices for building real-time use-cases in retail!
Raiffeisen Bank International is scaling an event-driven architecture across the group as part of a bank-wide transformation program. As a technology and architecture leader, RBI plays a key role in banking across CEE, and its experience will be shared with the audience in this webinar.
This webinar will walk through the story of a bank that uses an Oracle database to store sensitive customer information and RabbitMQ as the message broker for credit card transaction events.
Kafka is a platform used to collect, store, and process streams of data at scale, with numerous use cases. Join us in this live, interactive session, to learn more about Apache Kafka.
Confluent Cloud alleviates the burden of managing Apache Kafka, Schema Registry, Connect, and ksqlDB, so your teams can focus on modern app development and deliver immediate value with real-time use cases.
Many organizations have data locked away in their on-premises data center, making it impossible to take full advantage of modern cloud services. In this session, you’ll learn how to stream your on-prem data to the cloud in near real time.
This demo webinar will provide you with everything you need to get started with the latest updates to our cloud-native data streaming platform, Confluent Cloud.
We’ve got an exciting lineup of sessions designed to get you up to speed on all things Confluent Cloud! You’re sure to gain invaluable insights, no matter how many you’re able to join.
Confluent’s Stream Governance suite establishes trust in the real-time data moving throughout your business and delivers an easy, self-service experience for multiple teams to discover, understand, and put these streams to work.
The demand for fast results and rapid decision-making has driven financial institutions to adopt real-time event streaming and processing to stay on the competitive edge.
Listen back and view the presentations from the Data in Motion Tour 2021 - EMEA.
Kai Waehner, Field CTO at Confluent, will deliver his predictions on the hottest and most important data in motion use cases for 2022.
Confluent hosted a technical thought leadership session to discuss how leading organisations move to real-time architecture to support business growth and enhance customer experience.
To help organisations understand how data in motion can transform business, watch ‘The Strategic Importance of Data in Motion’, hosted by Tech UK.
In this online talk, we will answer the question, 'How much can we do with Kafka in 30 minutes of coding?'
In this webinar, see how Confluent’s data warehouse modernization solution leverages the Azure Synapse connector to help enterprises create a bridge across your Azure cloud and on-prem environments. We’ll explain how the solution works, and show you a demo!
Apache Kafka® was built with the vision to become the central nervous system that makes real-time data available to all the applications that need to use it, with numerous use cases like stock trading and fraud detection, and real-time analytics.
Today’s data sources are fast-moving and dispersed, which can leave businesses and engineers struggling to deliver data and applications in real-time. While this can be hard, we know it doesn’t have to be - because we’ve already made it easy.
Learn more about Confluent Platform 7.0 and how Cluster Linking enables you to leverage modern cloud-based platforms and build hybrid architectures with a secure, reliable, and cost-effective bridge.
Hivecell and Confluent deliver on the promise of bringing a piece of Confluent Cloud right to your desk, offering managed Kafka at the edge for the first time at scale.
Retailers that have embraced the opportunities of the prolonged pandemic are emerging leaner and stronger than before. Hear Lawrence Stoker, Senior Solutions Engineer at Confluent, walk through the data in motion use cases that are re-inventing the retail business.
Listen to this On-Demand online talk to hear how BT's digital strategy is becoming an event-driven business.
The world is changing! Organisations are now more globally integrated than ever before and new problems need to be solved. As systems scale and migrate into the cloud, those seeking to infiltrate enterprise systems are presented with new and more frequent opportunities to succeed.
We invite you to join Jesse Miller, our lead Product Manager for Health+, in an upcoming webinar to learn about how Health+ can optimize your deployment, give you the highest level of monitoring visibility, and provide intelligent alerts and accelerated support when you need it.
Forrester recently released a Total Economic Impact report that identified $2.5M+ in savings, a 257% ROI, and <6 month payback for organizations that used Confluent Cloud instead of Open Source Apache Kafka.
This webinar explores use cases and architectures for Kafka in the cybersecurity space, also featuring a very prominent example of combining Confluent and Splunk with Intel’s Cyber Intelligence Platform (CIP).
In Sencrop’s case, working with IoT at the “edge” means collecting and providing accurate data from the farm fields. Find out how AWS and Confluent Cloud are powering this real-time processing of data for anomaly detection and weather prediction.
In this session, we'll explore how to build serverless, event-driven architectures by using AWS Lambda with Kafka. We'll discuss how event-based compute like Lambda can be used to decrease the complexity of running, scaling, and operating stream-based architectures when building new applications.
We’ll discuss the challenges Storyblocks attempted to overcome with their monolithic apps and REST API architecture as the business grew rapidly, and the advantages they gained from using Confluent's event-driven architecture to power their mission-critical microservices.
In this session we will discuss how Apache Kafka has become the de facto standard for event-driven architecture, its community support, and the scale at which some customers are running it.
In this demo, we’ll show you how to modernize your database and move to the cloud by connecting multi-cloud and hybrid data to Google Cloud SQL in real time.
In this demo, we’ll walk through how you can start building a persistent pipeline for continuous migration from a legacy database to a modern, cloud database. You’ll see how to use Confluent and Amazon Aurora to create a bridge across your Amazon cloud and on-prem environments.
This demo webinar will provide you with everything you need to get started with the latest capabilities of our cloud-native data streaming platform, Confluent Cloud.
The Cloud - as we all know - offers the perfect solution to many challenges. Many organisations are already using fully-managed cloud services such as AWS S3, DynamoDB, or Redshift. This creates an opportunity to implement fully-managed Kafka with ease using Confluent Cloud on AWS.
Join Joseph Morais, Staff Cloud Partner SA, and Braeden Quirante, Cloud Partner SA at Confluent as they discuss Apache Kafka and Confluent.
Join us for this webinar to see how Confluent and Databricks enable companies to set data in motion across any system, at any scale, in near real-time.
In this 30-minute session, top Kafka experts will show everything for quickly getting started with real-time data movement ranging from on-demand cluster creation and data generation through to real-time stream processing and account management.
This demo webinar will show you how Confluent is the world’s most trusted data streaming platform, with resilience, security, compliance, and privacy built-in by default.
Leverage Confluent Cloud and Google Cloud Platform products such as BigQuery to modernize your data in minutes, setting your data in motion.
This webinar will provide you with everything you need to get started with all the latest capabilities available on our cloud-native data streaming platform, Confluent Cloud.
Interested in bringing stream processing to your organization, but unclear on how to get started? Designed to help you go from idea to proof of concept, this online talk dives into a few of the most popular stream processing use cases and workloads to help get you up and running with ksqlDB.
This webinar will address the problems with current approaches and show you how you can leverage Confluent’s platform for data in motion to make your data architecture fast, cost-effective, resilient, and secure.
In this webinar, we'll introduce you to Confluent Platform 7.0, which offers Cluster Linking to enable you to leverage modern cloud-based platforms and build hybrid architectures with a secure, reliable, and cost-effective bridge between on-prem and cloud environments.
Learn how to break data silos and accelerate time to market for new applications by connecting valuable data from your existing systems on-prem to your AWS environment using Confluent.
Today, with Confluent, enterprises can stream data across hybrid and multicloud environments to Amazon Redshift, powering real-time analysis while reducing total cost of ownership and time to value.
In this webinar, we'll introduce you to Confluent Platform 6.2, which offers Health+, a new feature that includes intelligent alerting and cloud-based monitoring tools to reduce the risk of downtime, streamline troubleshooting, surface key metrics, and accelerate issue resolution.
This webinar presents the decision making framework we use to coach our customers toward the most impactful and lowest cost PoC built on Kafka. The framework considers business impact, technology learning, existing resources, technical backgrounds, and cost to ensure the greatest chance of success.
Kafka Streams, a scalable stream processing client library in Apache Kafka, defines the processing logic as read-process-write cycles in which all processing state updates and result outputs are captured as log appends.
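The read-process-write cycle described above can be illustrated with a toy model (plain Python, not Kafka Streams internals): each input record is read, processing state is updated, and both the state change and the result are captured as log appends, so replaying the changelog rebuilds the state.

```python
changelog = []   # stands in for the state-store changelog topic
output_log = []  # stands in for the output topic
state = {}       # in-memory running counts, rebuildable from the changelog

def process(record):
    key = record["key"]
    state[key] = state.get(key, 0) + 1                    # process: update state
    changelog.append((key, state[key]))                   # state update as a log append
    output_log.append({"key": key, "count": state[key]})  # result as a log append

for rec in [{"key": "a"}, {"key": "b"}, {"key": "a"}]:
    process(rec)

print(output_log[-1])  # {'key': 'a', 'count': 2}

# Recovery: replaying the changelog rebuilds the state exactly.
rebuilt = {}
for key, value in changelog:
    rebuilt[key] = value
assert rebuilt == state
```

Capturing every state mutation as an append is what lets a failed processing instance restore its state store from the changelog topic and resume.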
Learn how Confluent Cluster Linking can seamlessly integrate and share data across these environments in real-time by leveraging your current Confluent/Apache Kafka deployments.
Hear how Fortune 500 companies and leading technology providers are driving real-time innovation through the power of data in motion to deliver richer customer experiences and automate backend operations.
In this short, 20-minute session you’ll gain everything you need to get started with development of your first app based upon event-driven microservices.
Today, with Confluent, enterprises can stream data across hybrid and multicloud environments to Google Cloud’s BigQuery, powering real-time analysis while reducing total cost of ownership and time to value.
In this Online Talk you will learn:
During this session you’ll see a pipeline built with data extraction from MongoDB Atlas, real-time transformation with ksqlDB, and simple loading into Snowflake.
This webinar presents a solution using Confluent Cloud on Azure, Azure Cosmos DB, and Azure Synapse Analytics, connected securely within an Azure VNET using Azure Private Link configured on Kafka clusters.
Watch this webinar to hear more about how Generali, Skechers and Conrad Electronics are using Qlik and Confluent to increase Kafka’s value.
This webinar will cover how you can protect your Kafka use cases with enterprise-grade security, reduce your Kafka operational burden and instead focus on building real-time apps that drive your business forward, and pursue hybrid and multi-cloud architectures with a data platform.
Establish event streaming as the central nervous system of your entire business, perhaps starting with a single use case and eventually architecting a system around event-driven microservices or delivering net-new capabilities like streaming ETL or a comprehensive customer 360.
Learn how teams around the world continue building innovative, mission-critical applications fueled by data in motion. This 4-part webinar series will provide you with bite-sized tutorials for how to get started with all the latest capabilities available on the platform.
With Confluent, you can start streaming data into MongoDB Atlas in just a few easy clicks. Learn how to bring real-time capabilities to your business and applications by setting data in motion.
Watch this session to learn how to streamline infrastructure, increase development velocity, unveil new use cases, and analyze data in real-time.
Join Confluent and Imply at this joint webinar to explore use cases for how Apache Kafka® integrates with Imply to bring data in motion and real-time analytics to life.
In this presentation, Lyndon Hedderly, Team Lead of Business Value Consulting at Confluent, will cover how Confluent works with customers to measure the business value of data streaming.
By shifting to a fully managed, cloud-native service for Kafka, you can unlock your teams to work on the projects that make the best use of your data in motion.
Listen back and view the presentations from the Confluent Streaming Event Series in Europe 2020.
By the end of this talk you will be able to: • Put words to the fundamental difficulties of our systems • Determine the role of streaming in your architectures • Present concrete use cases for streaming
The ASAPIO Connector for Confluent allows true application-based change data capture, along with full database access. This webinar will showcase a SAP- and Confluent-certified solution to enable real-time event streaming for on-prem SAP data.
Learn about the benefits of leveraging a cloud-native service for Kafka, and how you can lower your total cost of ownership (TCO) by 60% with Confluent Cloud while streamlining your DevOps efforts. Priya Shivakumar, Head of Product, Confluent Cloud, will share two short demos.
Hands-on workshop: Using Kubernetes, Spring Boot, Kafka Streams, and Confluent Cloud to rate Christmas movies.
Modern streaming data technologies like Apache Kafka® and Confluent KSQL, the streaming SQL engine for Apache Kafka, can help companies catch and detect fraud in real time instead of after the fact.
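The kind of real-time rule such engines express declaratively can be sketched as follows: flag a card when it produces more than a threshold number of transactions inside a short sliding window. This is a minimal conceptual sketch; the threshold, window size, and record shape are illustrative assumptions, not a KSQL implementation.

```python
from collections import defaultdict, deque

WINDOW_SEC = 60  # sliding window length (illustrative)
THRESHOLD = 3    # max transactions allowed inside the window (illustrative)

recent = defaultdict(deque)  # card -> timestamps of recent transactions

def on_transaction(card, ts):
    """Return True if this transaction pushes the card over the threshold."""
    window = recent[card]
    window.append(ts)
    while window and ts - window[0] > WINDOW_SEC:
        window.popleft()  # expire events that fell outside the sliding window
    return len(window) > THRESHOLD

txns = [("c1", 0), ("c1", 10), ("c1", 20), ("c1", 30), ("c2", 15)]
flags = [on_transaction(card, ts) for card, ts in txns]
print(flags)  # [False, False, False, True, False]
```

The point of doing this on a stream rather than in batch is that the fourth transaction is flagged the moment it arrives, not hours later in a nightly job.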
Mainframe offloading with Apache Kafka and its ecosystem can be used to keep a more modern data store in real-time sync with the mainframe. At the same time, it persists the event data on the bus to enable microservices and delivers the data to other systems such as data warehouses and search indexes.
Learn how Generali Switzerland set up an event-driven architecture to support their digital transformation project.
In this talk, we survey the stream processing landscape, the dimensions along which to evaluate stream processing technologies, and how they integrate with Apache Kafka®. Part 5 in the Apache Kafka: Online Talk Series.
Neha Narkhede explains how Apache Kafka was designed to support capturing and processing distributed data streams by building up the basic primitives needed for a stream processing system.
In this talk, we'll build a streaming data pipeline using nothing but our bare hands, the Kafka Connect API and KSQL.
In this session, we will share how companies around the world are using Confluent Cloud, a fully managed Apache Kafka® service, to migrate to GCP.
In this talk Gwen Shapira will break through the clutter and look at how successful companies are adopting centralized streaming platforms, and the use-cases and methodologies that we see practiced right now.
In this online talk, we’ll explore how and why companies are leveraging Confluent and MongoDB to modernize their architecture and leverage the scalability of the cloud and the velocity of streaming.
Recording from QCon New York 2017 Gwen Shapira discusses patterns of schema design, schema storage and schema evolution that help development teams build better contracts through better collaboration - and deliver resilient applications faster.
In this talk, I'll describe some of the design tradeoffs when building microservices, and how Apache Kafka's powerful abstractions can help.
There’s a prevailing enterprise perception that compliance with data protection regulations and standards is a burden that limits the leverage of data.
In this talk by Jun Rao, co-creator of Apache Kafka®, get a deep dive on some of the key internals that makes Apache Kafka popular, including how it delivers reliability and compaction. Part 2 in the Apache Kafka: Online Talk Series.
In this talk, Gwen Shapira describes the reference architecture of Confluent Enterprise, which is the most complete platform to build enterprise-scale streaming pipelines using Apache Kafka®. Part 1 in the Best Practices for Apache Kafka in Production Series.
Get answers to: How you would use Apache Kafka® in a micro-service application? How do you build services over a distributed log and leverage the fault tolerance and scalability that comes with it?
With the evolution of data-driven strategies, event-based business models are influential in innovative organizations.
This session shows how various sub-systems in Apache Kafka can be used to aggregate, integrate and attribute these signals into signatures of interest.
Learn from field experts as they discuss how to convert the data locked in traditional databases into event streams using HVR and Apache Kafka®.
In this webinar we want to share our experience on how the Swiss Mobiliar, the biggest Swiss household insurance enterprise, introduced Kafka and led it to enterprise-wide adoption with the help of AGOORA.com.
This talk takes an in-depth look at how Apache Kafka® can be used to provide a common platform on which to build data infrastructure driving both real-time analytics as well as event-driven applications.
Real-time data has value. But how do you quantify that value? This talk explores why valuing Kafka is important, and covers some of the problems in quantifying the value of a data infrastructure platform.
Apache Kafka is an open source event streaming platform. It is often used to complement or even replace existing middleware to integrate applications and build microservice architectures. Apache Kafka is already used in various projects in almost every bigger company today: understood, battle-tested, highly scalable, reliable, real-time.

Blockchain is a different story. This technology is in the news a lot, especially in relation to cryptocurrencies like Bitcoin. But what is the added value for software architectures? Is blockchain just hype that adds complexity? Or will it be used by everybody in the future, like a web browser or mobile app today? And how is it related to an integration architecture and event streaming platform?

This session explores use cases for blockchains and discusses different alternatives such as Hyperledger, Ethereum, and a Kafka-native tamper-proof blockchain implementation. Different architectures are discussed to understand when blockchain really adds value and how it can be combined with the Apache Kafka ecosystem to integrate blockchain with the rest of the enterprise architecture and build a highly scalable and reliable event streaming infrastructure.

Speakers: Kai Waehner, Technology Evangelist, Confluent; Stephen Reed, CTO, Co-Founder, AiB
Join us as we walk through an overview of this exciting new service from the experts in Kafka. Learn how to build robust, portable and lock-in free streaming applications using Confluent Cloud.
In this interactive discussion, the KSQL team will answer 10 of the toughest, most frequently asked questions about KSQL.
Confluent KSQL is the streaming SQL engine that enables real-time data processing against Apache Kafka®. It provides an easy-to-use, yet powerful interactive SQL interface for stream processing on Kafka.
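A continuous KSQL query is, semantically, a never-ending operation over an unbounded stream. The sketch below shows that semantics in plain Python (not the KSQL engine); the KSQL statement in the comment and the payment fields are illustrative assumptions.

```python
# Illustrative KSQL statement this mimics:
#   CREATE STREAM high_value AS
#     SELECT user_id, amount FROM payments WHERE amount > 1000;

def continuous_filter(stream, predicate):
    """Yield matching events as they arrive; on an unbounded stream this never terminates."""
    for event in stream:
        if predicate(event):
            yield event

payments = [
    {"user_id": "u1", "amount": 250},
    {"user_id": "u2", "amount": 4_000},
    {"user_id": "u3", "amount": 1_500},
]

high_value = list(continuous_filter(payments, lambda e: e["amount"] > 1000))
print([e["user_id"] for e in high_value])  # ['u2', 'u3']
```

Unlike a one-shot database query, the derived stream keeps emitting results as long as new payments arrive, which is what makes the SQL "continuous".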
One of the largest banks in the world—with 16 million clients globally—RBC built a real-time, scalable and event-driven data architecture for their rapidly growing number of cloud, machine learning and AI initiatives.
The ‘current state of stream processing’ walks through the origins of stream processing, applicable use cases and then dives into the challenges currently facing the world of stream processing as it drives the next data revolution.
This talk provides a deep dive into the details of the rebalance protocol, starting from its original design in version 0.9 up to the latest improvements and future work.
Learn about the impact of Confluent and Apache Kafka® on Funding Circle’s lending marketplace, from Kafka Connect to Exactly-Once processing.
This online talk will showcase how Apache Kafka® plays a key role within Express Scripts’ transformation from mainframe to a microservices-based ecosystem, ensuring data integrity between two worlds.
In this talk, we are going to observe the natural journey companies undertake to become real-time, the possibilities it opens for them, and the challenges they will face.
Kafka has a set of new features supporting idempotence and transactional writes that support building real-time applications with exactly-once semantics. This talk provides an overview of these features.
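The idempotence half of that story can be sketched with a toy model: the broker remembers the highest sequence number seen per (producer, partition), so a retried batch that was already appended is discarded rather than duplicated. This is a simplified illustration of the mechanism, not Kafka's implementation.

```python
log = []
last_seq = {}  # (producer_id, partition) -> highest sequence number appended

def append(producer_id, partition, seq, payload):
    """Append unless this (producer, partition, seq) was already written."""
    key = (producer_id, partition)
    if last_seq.get(key, -1) >= seq:
        return False  # duplicate retry: drop it silently
    log.append(payload)
    last_seq[key] = seq
    return True

append("p1", 0, 0, "order-created")
append("p1", 0, 1, "order-paid")
append("p1", 0, 1, "order-paid")  # network retry of the same batch
print(log)  # ['order-created', 'order-paid'] -- no duplicate
```

With this in place a producer can safely retry on timeouts, and transactions extend the same guarantee across multiple partitions to give exactly-once semantics end to end.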
This talk discusses the key design concepts within Apache Kafka Connect and the pros and cons of standalone vs distributed deployment modes.
Streaming technologies open up a range of use cases for Financial Services organisations, many of which will be explored in this talk.
Detecting fraudulent activity in real time can save a business significant amounts of money, but has traditionally been an area requiring a lot of complex programming and frameworks, particularly at scale.
Robin discusses the role of Apache Kafka as the de facto standard streaming data processing platform.
Join this Online Talk, to understand how and why Apache Kafka has become the de-facto standard for reliable and scalable streaming infrastructures in the finance industry.
Join us as we build a complete streaming application with KSQL. There will be plenty of hands-on action, plus a description of our thought process and design choices along the way. Part 2 in the Empowering Streams through KSQL series.
Join the discussion on the relationship between microservices and stream processing with Data-Intensive Apps author Martin Kleppmann, Confluent engineers Damian Guy and Ben Stopford, chaired by Jay Kreps, co-founder and CEO, Confluent.
In this video, Tim Berglund explains how you can speed up development with the Confluent Command Line Interface (CLI), which allows you to quickly iterate while implementing your applications and enables you to interact with the Confluent ecosystem.
This talk explores the benefits around cloud-native platforms and running Apache Kafka on Kubernetes, what kinds of workloads are best suited for this combination, and best practices.
In this session, we'll compare the two approaches to data integration and show how Dataflow allows you to join and transform and deliver data streams among on-prem and cloud Apache Kafka clusters, Cloud Pub/Sub topics and a variety of databases.
This online talk dives into the new Verified Integrations Program and the integration requirements, the Connect API and sources and sinks that use Kafka Connect. Part 2 of 2 in Building Kafka Connectors - The Why and How
This video offers an introduction to Kafka stream processing, with a focus on KSQL.
In this talk we'll examine how stateful stream processing can be used to build event-driven services, using a distributed log like Apache Kafka. In doing so, this Data Dichotomy is balanced with an architecture that exhibits demonstrably better scaling properties as complexity, team size, data volume, or velocity increase.
During this online talk, presenters from Confluent and Qlik will demonstrate how to accelerate data delivery to enable real-time analytics, make data more valuable with real-time data ingestion to Kafka, modernize data centers by streaming data in real-time, and demo a customer use case for advanced analytics.
Hans Jespersen (VP WW Systems Engineering, Confluent) opened the afternoon presentations with "Confluent Cloud: Agility for the modern data-driven enterprise" at Confluent’s streaming event in Paris.
Learn about the KSQL architecture and how to design and deploy interactive, continuous queries for streaming ETL and real-time analytics.
In this all too fabulous talk, we will be addressing the wonderful and new wonders of KSQL vs. KStreams and how Ticketmaster uses KSQL and KStreams in production to reduce development friction in machine learning products.
In the world of online streaming providers, real-time events are becoming the new standard, driving innovation and a new set of use cases to react to a quickly changing market. We explain how, from simple media player heartbeats, Data Reply fueled a diverse set of near-real-time use cases and services for its customer, from blocking concurrent media streams to recognizing ended sessions and trending content.
In this technical deep dive, we’ll discuss the proposition of Incremental Cooperative Rebalancing as a way to alleviate stop-the-world and optimize rebalancing in Kafka APIs.
This interactive whiteboard presentation discusses use cases leveraging the Apache Kafka® open source ecosystem as an event streaming platform to process IoT data.
Gwen Shapira presents core patterns of modern data engineering and explains how you can use microservices, event streams and a streaming platform like Apache Kafka to build scalable and reliable data pipelines. Part 1 of 3 in Streaming ETL - The New Data Integration series.
Without any coding or scripting, end users leverage their existing spreadsheet skills to build customized streaming apps for analysis, dashboarding, condition monitoring, or any kind of real-time pre- and post-processing of Kafka or ksqlDB streams and tables.
In this online talk, you will learn why, when facing Open Banking regulation and rapidly increasing transaction volumes, Nationwide decided to take load off their back-end systems through real-time streaming of data changes into Apache Kafka®.
Get an introduction to and demo of KSQL, Streaming SQL for Apache Kafka.
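To give a flavor of what such a demo typically covers, a minimal KSQL session might look like the following sketch; the stream and column names here are illustrative, not taken from the talk itself:

```sql
-- Declare a stream over an existing Kafka topic (hypothetical topic and fields)
CREATE STREAM pageviews (user_id VARCHAR, page VARCHAR, viewtime BIGINT)
  WITH (KAFKA_TOPIC='pageviews', VALUE_FORMAT='JSON');

-- Continuously filter events into a new, derived stream (and backing topic)
CREATE STREAM pageviews_home AS
  SELECT user_id, viewtime
  FROM pageviews
  WHERE page = '/home';
```

The second statement runs as a persistent query: every new event on the source topic is filtered and written downstream without any application code.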
In this online talk, Bosch’s Ralph Debusmann outlines their architectural vision for bringing many data streams into a single platform, surrounded by databases that can power complex real-time analytics.
Michael Noll provides an introduction to stream processing, use cases, and Apache Kafka.
In this online talk, Joe Beda, CTO of Heptio and co-creator of Kubernetes, and Gwen Shapira, principal data architect at Confluent and Kafka PMC member, will help you navigate through the hype, address frequently asked questions and deliver critical information to help you decide if running Kafka on Kubernetes is the right approach for your organization.
In this Online Talk Henrik Janzon, Solutions Engineer at Confluent, explains Apache Kafka’s internal design and architecture.
This talk will cover how to integrate real-time analytics and visualizations to drive business processes and how KSQL, streaming SQL for Kafka, can easily transform and filter streams of data in real time.
What are microservices, and how do they work in the Apache Kafka ecosystem?
In this talk, Matt Howlett will give a technical overview of Kafka, discuss some typical use cases (from surge pricing to fraud detection to web analytics) and show you how to use Kafka from within your C#/.NET applications.
Learn how to map practical data problems to stream processing and write applications that process streams of data at scale using Kafka Streams. Part 4 in the Apache Kafka: Online Talk Series.
Tim Berglund covers the patterns and techniques of using KSQL. Part 1 of the Empowering Streams through KSQL series.
Rabobank rose to this challenge and defined the Business Event Bus (BEB) as the place where business events from across the organization are shared between applications.
Learn how Centene improved their ability to interact and engage with healthcare providers in real time with MongoDB and Confluent Platform.
In this talk, get a short introduction to common approaches and architectures for stream processing (lambda, kappa) and learn how to use open-source stream processing tools (Flink, Kafka Streams, Hazelcast Jet).
This practical talk will dig into how we piece services together in event driven systems, how we use a distributed log to create a central, persistent narrative and what benefits we reap from doing so. Part 2 in the Apache Kafka® for Microservices: A Confluent Online Talk Series.
Presentation from Apache Kafka Meetup at Strata San Jose (3/14/17). Jay Kreps will introduce Kafka and explain why it has become the de facto standard for streaming data.
In this session, we will identify and demo some best practices for implementing a large scale IoT system that can stream MQTT messages to Apache Kafka.
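One common pattern for bridging MQTT into Kafka is running an MQTT source connector on Kafka Connect. A sketch of such a connector configuration is below; the server URI, topic names, and connector class follow Confluent's MQTT source connector, but all values are placeholders to adapt to your environment:

```json
{
  "name": "mqtt-source-demo",
  "config": {
    "connector.class": "io.confluent.connect.mqtt.MqttSourceConnector",
    "mqtt.server.uri": "tcp://mqtt-broker.example.com:1883",
    "mqtt.topics": "sensors/#",
    "kafka.topic": "iot-sensor-readings",
    "tasks.max": "1"
  }
}
```

Posting this JSON to the Connect REST API starts a task that subscribes to the MQTT topic filter and republishes each message to the named Kafka topic.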
What was once a ‘batch’ mindset is quickly being replaced with stream processing as the demands of the business impose real-time requirements on technology leaders.
In this talk, we are going to show some example use cases that Data Reply developed for some of its customers and how Real-Time Decision Engines had an impact on their businesses.
In this session, we discuss disaster scenarios that can take down entire Apache Kafka® clusters and share advice on how to plan, prepare and handle these events. Part 4 in the Best Practices for Apache Kafka in Production Series.
In this session, we discuss the basic patterns of multi-datacenter Apache Kafka® architectures, explore some of the use cases enabled by each architecture and show how Confluent Enterprise products make these patterns easy to implement. Part 3 in the Best Practices for Apache Kafka in Production Series.
We explain how the microservice ecosystem around Apache Kafka was built to ensure the ability to build and deploy new streaming agents on AWS fast and with the least amount of operational effort possible, as well as some of the issues we found and worked around.
In this talk, we’ll review the breadth of Apache Kafka as a streaming data platform, including its internal architecture and its approach to pub/sub messaging.
This online talk explores how Apache Druid and Apache Kafka® can turn a microservices ecosystem into a distributed real-time application with instant analytics.
Experts from Confluent and Attunity share how you can: realize the value of streaming data ingest with Apache Kafka®, turn databases into live feeds for streaming ingest and processing, accelerate data delivery to enable real-time analytics and reduce skill and training requirements for data ingest.
Learn how AO.com are enabling real-time event-driven applications to improve customer experience using Confluent Platform.
This online talk focuses on the key business drivers behind connecting to Kafka and introduces the new Confluent Verified Integrations Program. Part 1 of 2 in Building Kafka Connectors - The Why and How
In this talk, we’ll explain the architectural reasoning for Apache Kafka® and the benefits of real-time integration, and we’ll build a streaming data pipeline using nothing but our bare hands, Kafka Connect and KSQL.
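A typical first step in such a pipeline is pulling a database table into Kafka with Kafka Connect. The sketch below shows a JDBC source connector configuration in the style used by Confluent's JDBC connector; the connection URL, table, and topic prefix are hypothetical placeholders:

```json
{
  "name": "jdbc-source-orders",
  "config": {
    "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
    "connection.url": "jdbc:postgresql://db.example.com:5432/shop",
    "table.whitelist": "orders",
    "mode": "incrementing",
    "incrementing.column.name": "order_id",
    "topic.prefix": "db-"
  }
}
```

With `mode=incrementing`, the connector polls for rows whose `order_id` exceeds the last offset it recorded, producing each new row to the `db-orders` topic, where KSQL can then transform it.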
In this session, we go over everything that happens to a message – from producer to consumer, and pinpoint all the places where data can be lost. Build a bulletproof data pipeline with Apache Kafka. Part 2 in the Best Practices for Apache Kafka in Production Series.
In this talk, members of the Pinterest team offer lessons learned from their Confluent Go client migration and discuss their use cases for adopting Kafka Streams.
The Fourth Industrial Revolution (also known as Industry 4.0) is the ongoing automation of traditional manufacturing and industrial practices, using modern smart technology. Event streaming with Apache Kafka plays a major role in processing massive volumes of data in real time in a reliable, scalable, and flexible way, integrating with various legacy and modern data sources and sinks.
Join the Confluent Product team as we provide a technical overview of Confluent Platform 5.4, which delivers groundbreaking enhancements in the areas of security, disaster recovery and scalability.
This talk looks at one of the most common integration requirements – connecting databases to Apache Kafka.
Databases represent some of the most successful software that has ever been written and their importance over the last fifty years is hard to overemphasize. Over this time, they have evolved to form a vast landscape of products that cater to different data types, volumes, velocities, and query characteristics. But the broad definition of what a database is has changed relatively little.
Learn about typical Apache Kafka use cases and how organisations can process large quantities of data in real time using the Kafka Streams API and KSQL.
This talk focuses on how to integrate all the components of the Apache Kafka® ecosystem into an enterprise environment and what you need to consider as you move into production. Part 6 of the Apache Kafka: Online Talk Series.
Large enterprises, government agencies, and many other organisations rely on mainframe computers to deliver the core systems managing some of their most valuable and sensitive data. However, the processes and cultures around a mainframe often prevent the adoption of the agile, born-on-the web practices that have become essential to developing cutting edge internal and customer-facing applications.
This session will show you how to get streams of data into and out of Kafka with Kafka Connect and REST Proxy, maintain data formats and ensure compatibility with Schema Registry and Avro, and build real-time stream processing applications with Confluent KSQL and Kafka Streams.
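Maintaining data formats with Schema Registry means agreeing on schemas such as the following Avro record definition; this is a hypothetical example, not one from the session:

```json
{
  "type": "record",
  "name": "Pageview",
  "namespace": "com.example.streams",
  "fields": [
    {"name": "user_id", "type": "string"},
    {"name": "page", "type": "string"},
    {"name": "viewtime", "type": "long"}
  ]
}
```

Registering this schema for a topic's value lets producers serialize against it and consumers deserialize safely, with the registry's compatibility checks guarding future schema evolution.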
In this talk we will look at what event driven systems are; how they provide a unique contract for services to communicate and share data and how stream processing tools can be used to simplify the interaction between different services.
This session covers architectures best practises and recommendations for organisations aiming for a more cloud-centric approach in the use of Apache Kafka.
You know the fundamentals of Apache Kafka. You are a Spring Boot developer working with Apache Kafka, and you have chosen Spring Kafka to integrate with it. You implemented your first producer, consumer, and maybe some Kafka Streams applications, and it's working... hurray! You are ready to deploy to production. What can possibly go wrong?
This talk showcases different use cases in automation and Industrial IoT (IIoT) where an event streaming platform adds business value.
Event Streaming Paradigm: rethink data not as stored records or transient messages, but as a continually updating stream of events.
This talk will examine the underlying dichotomy we all face as we piece such systems together: one that is not well served today. The solution lies in blending the old with the new, and Apache Kafka® plays a central role. Part 1 in the Apache Kafka for Microservices: A Confluent Online Talk Series.
See how Kinetica enables businesses to leverage the streaming data delivered with Confluent Platform to gain actionable insights.
Learn different options for integrating systems and applications with Apache Kafka® and best practices for building large-scale data pipelines using Apache Kafka. Part 3 in the Apache Kafka: Online Talk Series.
Join The New York Times' Director of Engineering Boerge Svingen to learn how the innovative news giant of America transformed the way it sources content—all through the power of a real-time streaming platform.
In this online talk, Technology Evangelist Kai Waehner will discuss and demo how you can leverage technologies such as TensorFlow with your Kafka deployments to build a scalable, mission-critical machine learning infrastructure for ingesting, preprocessing, training, deploying and monitoring analytic models.
In this session, we will share how companies around the world are using Confluent Cloud, a fully managed Apache Kafka® service, to migrate to AWS.
Learn typical use cases for Apache Kafka®, how you can get real-time data streaming from Oracle databases to move transactional data to Kafka and enable continuous movement of your data to provide access to real-time analytics.
In this session, Nick Dearden covers the planning and operation of your KSQL deployment, including under-the-hood architectural details. Part 3 out of 3 in the Empowering Streams through KSQL series.
In this webinar, we take a hands-on approach to these questions and walk through connecting a simple application written in .NET to a Confluent Cloud-based Kafka cluster. Along the way, we point out best practices for developing and deploying applications that scale easily.
Learn about the recent additions to Apache Kafka® to achieve exactly-once semantics (EoS) including support for idempotence and transactions in the Kafka clients.
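In the Kafka clients, these features are enabled through configuration. A sketch of the relevant producer and consumer settings is below; the transactional ID is an illustrative placeholder:

```properties
# Producer: idempotent, transactional writes
enable.idempotence=true
transactional.id=orders-processor-1
acks=all

# Consumer: only read data from committed transactions
isolation.level=read_committed
```

Setting a `transactional.id` lets the producer wrap sends (and consumer offset commits) in transactions via `initTransactions()`, `beginTransaction()`, and `commitTransaction()`, while `read_committed` consumers never see aborted writes.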
This online talk is based on real-world experience of Kafka deployments and explores a collection of common mistakes that are made when running Kafka in production and some best practices to avoid them.
Hear from Intrado’s Thomas Squeo, CTO, and Confluent’s Chief Customer Officer, Roger Scott, to learn how Intrado future-proofed their architecture to support current and future real-time business initiatives.
Learn how NAV (Norwegian Work and Welfare Department) are using Apache Kafka to distribute and act upon events. NAV currently distributes more than one-third of the national budget to citizens in Norway or abroad. They are there to assist people through all phases of life within the domains of work, family, health, retirement, and social security. Events happening throughout a person’s life determine which services NAV provides to them, how they provide them, and when they offer them.
This talk will look at how stateful stream processing is used to build truly autonomous services, with the distributed guarantees of exactly-once processing in event-driven services supported by Apache Kafka®. Part 3 in the Apache Kafka for Microservices: A Confluent Online Talk Series.
Joe Beda, CTO of Heptio and co-creator of Kubernetes, and Gwen Shapira, principal data architect at Confluent, will help you decide if running Kafka on Kubernetes is the right approach for your organization.
We’ll discuss how to leverage some of the more advanced transformation capabilities available in both KSQL and Kafka Connect. Part 3 of 3 in Streaming ETL - The New Data Integration online talk series.
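On the Kafka Connect side, advanced transformation usually means Single Message Transforms (SMTs) chained in the connector configuration. The sketch below uses two of Kafka's built-in SMT classes; the transform aliases and field names are hypothetical:

```properties
# Chain two built-in SMTs on a connector's records
transforms=addTimestamp,maskCard
# Stamp each record value with an ingestion timestamp field
transforms.addTimestamp.type=org.apache.kafka.connect.transforms.InsertField$Value
transforms.addTimestamp.timestamp.field=ingested_at
# Mask a sensitive field before it reaches the sink
transforms.maskCard.type=org.apache.kafka.connect.transforms.MaskField$Value
transforms.maskCard.fields=card_number
```

Each SMT is applied in order to every record flowing through the connector, so light-touch enrichment and redaction can happen without a separate stream processing job.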
Event streaming: from technology to a completely new business paradigm.
This online talk includes in-depth practical demonstrations of how Confluent and Panopticon together support several key financial services and IoT applications, including transaction cost analysis and risk monitoring.
Microservices guru Sam Newman, Buoyant CTO Oliver Gould and Apache Kafka® engineer Ben Stopford are joined by Jay Kreps, co-founder and CEO, Confluent for a Q&A session where they discuss and debate all things Microservices.
HomeAway, the world’s leading online marketplace for the vacation rental industry, uses Apache Kafka® and Confluent to match travelers with 2 million+ unique places to stay in 190 countries.
Industry 4.0 and smart manufacturing are driving the manufacturing industry to modernize their software infrastructure. This session will look at the unique business drivers for modernizing the manufacturing industry and how MQTT and Kafka can help make it a reality.
Learn how companies will leverage event streaming, Apache Kafka, and Confluent to meet the demands of a real-time market, rising regulations, customer expectations, and much more in 2021.
Get an introduction to Apache Kafka® and how it serves as a foundation for streaming data pipelines and applications that consume/process real-time data streams. Part 1 in the Apache Kafka: Online Talk Series.
Join The New York Times' Director of Engineering Boerge Svingen to learn how the innovative news giant of America transformed the way it sources content while still maintaining searchability, accuracy and accessibility—all through the power of a real-time streaming platform.
Explore the use cases and architecture for Apache Kafka®, and how it integrates with MongoDB to build sophisticated data-driven applications that exploit new sources of data.
Pick up best practices for developing applications that use Apache Kafka, beginning with a high level code overview for a basic producer and consumer.
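Many of those best practices come down to a handful of client settings. The following producer configuration is a commonly recommended starting point, not an excerpt from the talk; tune the values for your own latency and throughput needs:

```properties
bootstrap.servers=broker1:9092,broker2:9092
# Durability: wait for acknowledgement from all in-sync replicas
acks=all
# Idempotence avoids duplicates when retries occur
enable.idempotence=true
# Small linger batches records for throughput at modest latency cost
linger.ms=10
compression.type=lz4
```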
Operating a complex distributed system such as Apache Kafka can be a lot of work. In this talk we will review common issues and mitigation strategies seen from the trenches while helping teams around the globe with their Kafka infrastructure.
Neha Narkhede talks about the experience at LinkedIn moving from batch-oriented ETL to real-time streams using Apache Kafka and how the design and implementation of Kafka was driven by this goal of acting as a real-time platform for event data.
Watch Lyndon Hedderly's keynote from Big Data Analytics London 2018.
Originally presented by Gwen Shapira at Gluecon 2018, this talk covers the similarities and differences between the communication layer provided by a service mesh and Apache Kafka and their implementations, as well as ways you can combine them together.
This 60-minute online talk is packed with practical insights: you will learn how Kafka fits into a data ecosystem that spans a global enterprise and supports use cases for both data ingestion and integration.
Join experts from VoltDB and Confluent to see why and how enterprises are using Apache Kafka as the central nervous system in combination with VoltDB.
Confluent Co-founder Jun Rao discusses how Apache Kafka® became the predominant publish/subscribe messaging system that it is today, Kafka's most recent additions to its enterprise-level set of features and how to evolve your Kafka implementation into a complete real-time streaming data platform.
Join Kai Waehner, Technology Evangelist at Confluent, for this session which explores various telecommunications use cases, including data integration, infrastructure monitoring, data distribution, data processing and business applications. Different architectures and components from the Kafka ecosystem are also discussed.
In this presentation, we discuss best practices of monitoring Apache Kafka®. Part 5 of the Best Practices for Apache Kafka in Production series.
Learn how Apache Kafka and Confluent help the gaming industry leverage real-time integration, event streaming, and data analytics for seamless gaming experiences at scale.