    Galbot Unveils Twin Breakthroughs in Embodied AI: DexNDM and NavFoM Revolutionize Dexterous Manipulation and Autonomous Navigation

By Arjun Patel | November 17, 2025


Galbot is proud to announce two major technological advances in embodied intelligence: DexNDM, a neural dynamics model designed to revolutionize robotic dexterous manipulation, and NavFoM, the world's first cross-embodiment, cross-task navigation foundation model. Developed in collaboration with Tsinghua University, Peking University, the University of Adelaide, and Zhejiang University, these innovations mark a major leap forward in robotics, enabling robots to autonomously carry out complex tasks across diverse environments.

DexNDM: A Game-Changer in Dexterous Manipulation

DexNDM (Dexterous Hand Neural Dynamics Model) is a groundbreaking neural dynamics model that empowers robots to perform high-precision in-hand rotations of objects of varying sizes and geometries. From tiny, delicate components to large, irregular objects, DexNDM adapts to dynamic forces and varying wrist orientations, solving a longstanding challenge in dexterous manipulation. This innovation pushes the boundaries of what robots can achieve in fine motor skills, offering unparalleled flexibility and adaptability.

Key highlights of DexNDM:

Unprecedented Generalization Across Object Types: Unlike traditional models restricted to specific object sizes or shapes, DexNDM can handle a wide range of objects, from microelectronics to larger, more complex geometries (such as books or elongated tools), breaking past limitations in robotic manipulation.
Multi-Axis, Multi-Posture Rotation: DexNDM excels at rotating objects along multiple axes, regardless of the hand's orientation. Whether the hand is facing up, down, or sideways, the model ensures stable and precise rotations. This capability allows robots to perform complex tasks, such as adjusting the grip and rotating objects for precise insertion or assembly, even in hard-to-reach positions.
Highly Dexterous Teleoperation System: Galbot has integrated DexNDM into a teleoperation system, enabling robots to perform complex tasks such as screw tightening, hammering nails, and long-range assembly. This system allows human operators to issue high-level commands while the robot autonomously handles precise finger movements, ensuring unmatched flexibility and robustness.
Expert-to-Generalist Learning Paradigm: DexNDM uses a unique expert-to-generalist learning approach. The model first trains specialized policies for handling specific tasks and object types, and then distills these policies into a unified, adaptable policy. This approach allows DexNDM to seamlessly perform tasks across diverse robot types and environments with minimal fine-tuning.
DexNDM represents a major milestone in the journey from robotic dexterity to real-world productivity, enabling robots to perform tasks like tool use, assembly, and fine-grained manipulation in industries such as manufacturing, healthcare, and logistics.
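The expert-to-generalist recipe described above, training specialist policies and then distilling them into a single student, can be sketched in miniature. Everything in this sketch is a hypothetical stand-in: the linear "experts", the state and action dimensions, and the least-squares "distillation" are toy substitutes for the neural policies Galbot describes, chosen only to illustrate the data flow of pooling expert rollouts and fitting one unified policy.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy dimensions (assumptions, not from the paper): each "expert" policy is a
# linear map trained for one object class; the "generalist" is a single linear
# map distilled from all of them.
STATE_DIM, ACTION_DIM, N_EXPERTS = 8, 4, 3
experts = [rng.normal(size=(ACTION_DIM, STATE_DIM)) for _ in range(N_EXPERTS)]

# Step 1 (expert rollouts): each expert labels states with its own actions,
# mimicking the data-collection phase of expert-to-generalist learning.
states, actions = [], []
for W in experts:
    s = rng.normal(size=(500, STATE_DIM))
    states.append(s)
    actions.append(s @ W.T)          # the expert's action for each state
X = np.vstack(states)                # pooled states,  shape (1500, STATE_DIM)
Y = np.vstack(actions)               # pooled actions, shape (1500, ACTION_DIM)

# Step 2 (distillation): fit one generalist policy by least squares on the
# pooled expert labels; a real system would train a neural network instead.
W_gen, *_ = np.linalg.lstsq(X, Y, rcond=None)

# The single distilled policy now maps any state to an action directly.
test_states = rng.normal(size=(10, STATE_DIM))
pred = test_states @ W_gen           # shape (10, ACTION_DIM)
```

The design point the sketch makes is that after distillation only one policy is deployed, so the robot does not need to know at run time which specialist "should" handle the current object.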

    NavFoM: A Leap in Autonomous Navigation

NavFoM (Navigation Foundation Model) is the world's first cross-embodiment, cross-task navigation foundation model, enabling robots to autonomously navigate across a wide range of environments and tasks. Developed by Galbot and its research collaborators, NavFoM consolidates fragmented navigation technologies into a unified system capable of seamlessly integrating perception, understanding, decision-making, and action.

    Key highlights of NavFoM:

All Environments: NavFoM supports both indoor and outdoor navigation, enabling zero-shot operation in unseen environments without requiring pre-mapping or additional training data. Robots powered by NavFoM can navigate new environments without prior knowledge, significantly reducing deployment time and effort. Whether navigating a cluttered warehouse, a crowded public space, or dynamic outdoor terrain, NavFoM seamlessly adapts to each scenario.
Cross-Task: NavFoM can perform a wide range of navigation tasks, from target-following (e.g., following a person) to autonomous navigation (e.g., navigating through a busy urban area). It also works with natural-language commands, allowing human operators to issue simple instructions such as "follow that person" or "navigate to the red car."
Cross-Embodiment: One of NavFoM's standout features is its ability to adapt to diverse robot types, including quadrupeds, wheeled humanoids, drones, and even vehicles. This cross-embodiment adaptability ensures that NavFoM can be applied across different robot types, enhancing scalability and development efficiency. Whether it is driving an indoor robot, a drone, or an autonomous vehicle, NavFoM delivers consistent performance.
Unified Navigation Paradigm: NavFoM introduces a novel method for processing video streams and text commands to generate precise action trajectories, unifying traditionally fragmented navigation models. This allows robots to see, understand, and act in real time across tasks and environments. For instance, a quadruped robot's ability to avoid crowds in a shopping mall can inform a drone's ability to navigate obstacles in the air, increasing efficiency and accuracy in unfamiliar environments.
With NavFoM, Galbot has redefined the logic of navigation. Robots powered by NavFoM can perceive their surroundings, plan paths, and make real-time decisions in completely unknown environments with minimal human intervention.
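The unified paradigm described above, one model consuming a video stream plus a text command and emitting an action trajectory, can be illustrated with a toy interface. The class name, the pooling "encoder", and the random projection below are invented stand-ins for NavFoM's actual architecture; the sketch only shows the shape of the input/output contract such a model exposes.

```python
from dataclasses import dataclass
import numpy as np

rng = np.random.default_rng(1)

@dataclass
class NavObservation:
    frames: np.ndarray   # (T, H, W, 3) video stream from the robot's cameras
    instruction: str     # natural-language command, e.g. "follow that person"

def encode(obs: NavObservation) -> np.ndarray:
    """Toy stand-in for a shared encoder: average-pool the frames into one
    visual vector and append a crude hashed embedding of the first words."""
    visual = obs.frames.reshape(obs.frames.shape[0], -1).mean(axis=0)[:16]
    words = obs.instruction.split()[:4]
    text = np.array([hash(w) % 997 / 997.0 for w in words])
    text = np.pad(text, (0, 4 - len(text)))      # fixed 4-dim text slot
    return np.concatenate([visual, text])        # fused 20-dim embedding

def plan(obs: NavObservation, horizon: int = 5) -> np.ndarray:
    """Map the fused embedding to a short (x, y) waypoint trajectory.
    A fixed random projection stands in for the trained foundation model."""
    z = encode(obs)
    W = rng.normal(size=(horizon * 2, z.size)) * 0.1
    return (W @ z).reshape(horizon, 2)

obs = NavObservation(frames=rng.random((8, 4, 4, 3)),
                     instruction="navigate to the red car")
traj = plan(obs)   # one (x, y) waypoint per planning step
```

The key property the interface captures is embodiment-agnosticism: nothing in the observation or the trajectory is specific to legs, wheels, or rotors, which is what lets a single model serve quadrupeds, humanoids, drones, and vehicles alike.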

Real-World Applications: Unlocking New Possibilities with DexNDM and NavFoM

Both DexNDM and NavFoM represent twin breakthroughs in embodied AI, pushing the boundaries of robotic capabilities in manipulation and navigation. DexNDM enhances robots' dexterous manipulation abilities, enabling high-precision, multi-step tasks involving tools and fine manipulation, while NavFoM redefines how robots understand and navigate their environments with autonomy and adaptability.

Together, these models provide a comprehensive solution for real-world applications across industries, from assembly and tool use in manufacturing to navigation and autonomous decision-making in dynamic environments.

    A Future Powered by Embodied Intelligence

Galbot's NavFoM and DexNDM bring us closer to a world where robots not only understand their environment but can perform complex tasks with precision and autonomy. These innovations are poised to redefine industries, including manufacturing, logistics, healthcare, retail, and beyond, enabling robots to operate in real-world environments with unprecedented adaptability and efficiency.

For more information on DexNDM, visit the project website: www.meowuu7.github.io/DexNDM

For more information on NavFoM, visit the project website: https://pku-epic.github.io/NavFoM-Internet/

    About Galbot

Founded in 2023, Beijing Galbot Co., Ltd. is a global leader in embodied AI and general-purpose robotics. Galbot's flagship product, the Galbot G1, has been widely deployed, achieving over one year of proven, stable real-world operation across industries such as manufacturing, logistics, healthcare, and retail. The Galbot G1 currently operates more than 20 fully autonomous Galbot Stores across over 20 cities, with plans to expand to 100 stores by the end of the year.

With over $400M in funding, Galbot is transforming industries by setting new benchmarks in automation, delivering solutions that drive operational efficiency, scalability, and productivity across global sectors.
