
Batch data processing is too slow for real-time AI: How open-source Apache Airflow 3.0 solves the problem with event-driven data orchestration

By Sophia Ahmed Wilson | April 23, 2025



Moving data from diverse sources to the right location for AI use is a challenging task. That's where data orchestration technologies like Apache Airflow fit in.

Today, the Apache Airflow community is out with its biggest update in years, with the debut of the 3.0 release. The new release marks the first major version update in four years. Airflow has remained active, though, steadily iterating on the 2.x series, including the 2.9 and 2.10 updates in 2024, which both had a heavy focus on AI.

In recent years, data engineers have adopted Apache Airflow as their de facto standard tool. Apache Airflow has established itself as the leading open-source workflow orchestration platform, with over 3,000 contributors and widespread adoption across Fortune 500 companies. There are also multiple commercial services based on the platform, including Astronomer Astro, Google Cloud Composer, Amazon Managed Workflows for Apache Airflow (MWAA) and Microsoft Azure Data Factory Managed Airflow, among others.

As organizations struggle to coordinate data workflows across disparate systems, clouds and increasingly AI workloads, their needs keep growing. Apache Airflow 3.0 addresses critical enterprise needs with an architectural redesign that could improve how organizations build and deploy data applications.

“To me, Airflow 3 is a new beginning, it's a foundation for a much better set of capabilities,” Vikram Koka, Apache Airflow PMC (project management committee) member and Chief Strategy Officer at Astronomer, told VentureBeat in an exclusive interview. “This is almost a complete refactor based on what enterprises told us they needed for the next level of mission-critical adoption.”

Enterprise data complexity has changed data orchestration needs

As businesses increasingly depend on data-driven decision-making, the complexity of data workflows has exploded. Organizations now manage intricate pipelines spanning multiple cloud environments, diverse data sources and increasingly sophisticated AI workloads.

Airflow 3.0 emerges as a solution specifically designed to meet these evolving enterprise needs. Unlike previous versions, this release breaks away from a monolithic package, introducing a distributed client model that provides flexibility and security. This new architecture allows enterprises to:

1. Execute tasks across multiple cloud environments.
2. Implement granular security controls.
3. Support diverse programming languages.
4. Enable true multi-cloud deployments.

Airflow 3.0's expanded language support is also interesting. While previous versions were primarily Python-centric, the new release natively supports multiple programming languages.

Airflow 3.0 is set to support Python and Go, along with planned support for Java, TypeScript and Rust. This approach means data engineers can write tasks in their preferred programming language, reducing friction in workflow development and integration.
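
For readers unfamiliar with how Airflow tasks are authored, here is a minimal sketch of a Python-defined workflow in the decorator style Airflow 3.0 exposes through its task SDK. The DAG and task names are illustrative, and the exact import path may vary depending on how Airflow is packaged in your environment:

```python
# Minimal sketch of a Python-authored Airflow 3.0 DAG.
# DAG and task names are illustrative, not from the article.
from airflow.sdk import dag, task


@dag(schedule=None, catchup=False)
def example_language_support():
    @task
    def extract_events() -> list[dict]:
        # Placeholder extraction step; in practice this would call an API or read storage.
        return [{"id": 1, "value": 42}]

    @task
    def load_events(events: list[dict]) -> None:
        # Placeholder load step; the output of extract_events is passed between tasks.
        print(f"Loaded {len(events)} events")

    load_events(extract_events())


example_language_support()
```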

Event-driven capabilities transform data workflows

Airflow has traditionally excelled at scheduled batch processing, but enterprises increasingly need real-time data processing capabilities. Airflow 3.0 now supports that need.

“A key change in Airflow 3 is what we call event-driven scheduling,” Koka explained.

Instead of running a data processing job every hour, Airflow now automatically starts the job when a specific data file is uploaded or when a particular message appears. This could include data loaded into an Amazon S3 cloud storage bucket or a streaming data message in Apache Kafka.
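
To make the contrast concrete, here is a minimal sketch of asset-based scheduling, the pattern that Airflow's event-driven triggers build on: a downstream DAG is scheduled on an asset rather than a clock, so it runs when the data is marked as updated instead of on a fixed interval. All names are illustrative, and watching fully external sources such as S3 uploads or Kafka topics typically involves additional provider integrations:

```python
# Sketch: a consumer DAG scheduled on an asset instead of a fixed interval.
# Asset and DAG names are illustrative, not from the article.
from airflow.sdk import Asset, dag, task

raw_uploads = Asset("raw_customer_uploads")  # could represent a file landing in S3


@dag(schedule=None, catchup=False)
def producer():
    @task(outlets=[raw_uploads])
    def publish_file():
        # In practice this task would write the new file; marking the asset
        # as updated is what triggers downstream DAGs.
        ...

    publish_file()


@dag(schedule=[raw_uploads], catchup=False)  # runs when the asset updates, not on a clock
def consumer():
    @task
    def process_file():
        ...

    process_file()


producer()
consumer()
```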

The event-driven scheduling capability addresses a critical gap between traditional ETL [extract, transform and load] tools and stream processing frameworks like Apache Flink or Apache Spark Structured Streaming, allowing organizations to use a single orchestration layer for both scheduled and event-triggered workflows.

Airflow will accelerate enterprise AI inference execution and compound AI

The event-driven data orchestration will also help Airflow support rapid inference execution.

For example, Koka detailed a use case where real-time inference is used for professional services like legal time tracking. In that scenario, Airflow can be used to help collect raw data from sources like calendars, emails and documents. A large language model (LLM) can be used to transform unstructured information into structured data. Another pre-trained model can then be used to analyze the structured time-tracking data, determine whether the work is billable, then assign appropriate billing codes and rates.
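
A compound workflow like the one Koka describes could be sketched as a chain of Airflow tasks, with each stage handing its structured output to the next. The task names and model-calling steps below are hypothetical stand-ins rather than an actual implementation:

```python
# Sketch of the time-tracking example as a chained Airflow DAG.
# All task names and model-calling helpers are hypothetical stand-ins.
from airflow.sdk import dag, task


@dag(schedule=None, catchup=False)
def legal_time_tracking():
    @task
    def collect_raw_activity() -> list[str]:
        # Stand-in for pulling calendar entries, emails and documents.
        return ["30 min call with Acme Corp re: contract review"]

    @task
    def structure_with_llm(raw_items: list[str]) -> list[dict]:
        # Stand-in for an LLM call that turns free text into structured records.
        return [{"client": "Acme Corp", "minutes": 30, "activity": "contract review"}]

    @task
    def classify_billing(records: list[dict]) -> list[dict]:
        # Stand-in for a pre-trained model that decides billability and assigns codes.
        return [{**r, "billable": True, "billing_code": "L120"} for r in records]

    classify_billing(structure_with_llm(collect_raw_activity()))


legal_time_tracking()
```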

Koka referred to this approach as a compound AI system, a workflow that strings together different AI models to complete a complex task efficiently and intelligently. Airflow 3.0's event-driven architecture makes this type of real-time, multi-step inference process possible across various enterprise use cases.

Compound AI is an approach that was first defined by the Berkeley Artificial Intelligence Research Center in 2024 and is a bit different from agentic AI. Koka explained that agentic AI allows for autonomous AI decision-making, whereas compound AI has predefined workflows that are more predictable and reliable for enterprise use cases.

Playing ball with Airflow: how the Texas Rangers look to benefit

Among the many users of Airflow is the Texas Rangers major league baseball team.

Oliver Dykstra, full-stack data engineer at the Texas Rangers Baseball Club, told VentureBeat that the team uses Airflow hosted on Astronomer's Astro platform as the 'nerve center' of its baseball data operations. He noted that all player development, contracts, analytics and, of course, game data is orchestrated through Airflow.

“We're looking forward to upgrading to Airflow 3 and its improvements to event-driven scheduling, observability and data lineage,” Dykstra stated. “As we already rely on Airflow to manage our critical AI/ML pipelines, the added efficiency and reliability of Airflow 3 will help increase trust and resiliency of these data products within our entire organization.”

What this means for enterprise AI adoption

For technical decision-makers evaluating data orchestration strategy, Airflow 3.0 delivers actionable benefits that can be implemented in phases.

The first step is evaluating current data workflows that could benefit from the new event-driven capabilities. Organizations can identify data pipelines that currently run as scheduled jobs but could be managed more efficiently with event-based triggers. This shift can significantly reduce processing latency while eliminating wasteful polling operations.
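
In practice, that migration can be as small as swapping a DAG's time-based schedule for a data-aware one. A hedged before-and-after sketch, with illustrative names, might look like this:

```python
# Sketch: converting a polling schedule to an asset-driven one.
# Asset and DAG names are illustrative, not from the article.
from airflow.sdk import Asset, dag, task

new_orders = Asset("new_orders_export")  # e.g. updated when an upstream export lands


# @dag(schedule="@hourly", catchup=False)    # before: hourly polling
@dag(schedule=[new_orders], catchup=False)   # after: runs only when new data arrives
def process_orders():
    @task
    def load_orders():
        ...

    load_orders()


process_orders()
```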

Next, technology leaders should assess their development environments to determine whether Airflow's new language support could consolidate fragmented orchestration tooling. Teams currently maintaining separate orchestration tools for different language environments can begin planning a migration strategy to simplify their technology stack.

For enterprises leading the way in AI implementation, Airflow 3.0 represents a critical infrastructure component that can address a significant challenge in AI adoption: orchestrating complex, multi-stage AI workflows at enterprise scale. The platform's ability to coordinate compound AI systems could help enable organizations to move beyond proof-of-concept to enterprise-wide AI deployment with proper governance, security and reliability.


