
Bringing Openness to the Machine Society: OpenMind Unifies Intelligence and Order

2025-08-08 18:00
Open-source operating system + on-chain collaboration network: turning robots into plug-and-play internet citizens.

Alongside AI agents, embodied robots are the other major vertical where AI lands in the physical world. Morgan Stanley has predicted that the global humanoid robot market could exceed $5 trillion by 2050.


As AI develops, robots will gradually evolve from mechanical arms in factories into companions in daily life, relying on AI to perceive and understand their surroundings and, eventually, to make decisions on their own. The problem is that today's robots are more like a crowd of "mutes" that cannot communicate with one another: each manufacturer uses its own language and logic, software is incompatible, and intelligence cannot be shared. It is as if you bought a Xiaomi robot and a Tesla robot, but they could not even assess road conditions together, let alone coordinate to complete a task.


What OpenMind wants to change is precisely this every-man-for-himself situation. It does not manufacture robots; instead, it aims to build a collaboration system that lets robots speak the same language, follow the same rules, and work together. By analogy: iOS and Android enabled the explosion of smartphone applications, and Ethereum provided a common foundation for the crypto world; OpenMind aims to build a unified "operating system" and "collaboration network" for the world's robots.


In short, OpenMind is building a universal operating system for robots that enables them to perceive, act, and cooperate securely and at scale in any environment through decentralized collaboration.


Who Is Backing This Open Foundation


OpenMind has completed $20 million in seed and Series A financing led by Pantera Capital. More important than the amount is the breadth and complementarity of the capital, which assembles almost every key piece of this track. On one end is long-term backing from the Western technology and financial ecosystem: Ribbit, Coinbase Ventures, DCG, Lightspeed Faction, Anagram, Pi Network Ventures, Topology, and Primitive Ventures. These investors are familiar with the paradigm shifts in crypto and AI infrastructure and can contribute model, network, and compliance expertise to an "agent economy + machine internet." On the other end is industrial momentum from the East, represented by the supply-chain and manufacturing network around Sequoia China, which knows exactly what processes and cost thresholds separate a prototype from a scalable product. Together, the two forces give OpenMind not just the money but also the path and resources to go from lab to production line and from software to underlying manufacturing.



This path is also converging with traditional capital markets. In June 2025, when KraneShares launched its Global Humanoid and Embodied Intelligence Index ETF (KOID), it chose Iris, a humanoid robot custom-built by OpenMind and RoboStore, to ring the opening bell at Nasdaq, the first "robot guest" to perform that ceremony in the exchange's history. The event was both a convergence of the technology and finance narratives and a public signal about how machine assets might be priced and settled.


As Pantera Capital partner Nihal Maunder puts it:


"If we want intelligent machines to operate in an open environment, we need an open intelligent network. What OpenMind is doing for robots is akin to what Linux is to software and Ethereum is to blockchain."


From Lab to Production: The Team


OpenMind's founder Jan Liphardt is an associate professor at Stanford University and a former professor at UC Berkeley, with a long research record in data and distributed systems and deep roots in both academia and engineering. He advocates open-source reuse, replacing black boxes with auditable, traceable mechanisms, and uniting AI, robotics, and cryptography in an interdisciplinary approach.


OpenMind's core team comes from organizations such as OKX Ventures, the Oxford Robotics Institute, Palantir, Databricks, and Perplexity, covering robot control, perception and navigation, multimodal and LLM orchestration, distributed systems, and on-chain protocols. An advisory board drawn from academia and industry, including Stanford robotics lead Steve Cousins, Bill Roscoe of the Oxford Blockchain Centre, and Imperial College safe-AI professor Alessio Lomuscio, helps ensure the security, compliance, and reliability of the robots.


OpenMind's Solution: Two-Layer Architecture, One Order


OpenMind has built a reusable infrastructure that allows robots to collaborate and exchange information across devices, manufacturers, and even borders:


Device side: OM1, an AI-native operating system for physical robots that connects the entire perception-to-execution loop, so robots of different forms can understand their environment and perform tasks;


Network side: FABRIC, a decentralized collaboration network that provides identity, task allocation, and communication mechanisms, so collaborating robots can identify one another, divide work, and share status.


This "operating system + network layer" combination means robots no longer act only as individuals: they can cooperate within a unified collaboration network, align on process, and jointly complete complex tasks.


OM1: AI-Native Operating System for the Physical World


Just as a smartphone needs iOS or Android to run apps, a robot needs an operating system to run AI models, process sensor data, reason toward decisions, and execute actions.


OM1 was born for exactly this purpose. It is an AI-native operating system for robots in the physical world, enabling them to perceive, understand, plan, and complete tasks across environments. Unlike traditional closed robot control systems, OM1 is open-source, modular, and hardware-agnostic, running on humanoid, quadruped, wheeled, robotic-arm, and other form factors.


Four Core Steps: From Perception to Execution


OM1 breaks robot intelligence down into four common steps: Perception → Memory → Planning → Action. OM1 fully modularizes this process and connects the modules through a unified data language, so intelligent capabilities can be composed, replaced, and verified.
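As a concrete illustration, here is a minimal sketch of that four-step loop in Python. All class and field names are hypothetical, not OM1's actual API; the point is that each stage is a swappable module exchanging one unified, timestamped message type.

```python
from dataclasses import dataclass, field
import time

@dataclass
class Snippet:
    """Unified data language: a timestamped natural-language snippet."""
    text: str
    source: str
    ts: float = field(default_factory=time.time)

class Perception:
    def sense(self) -> list[Snippet]:
        # Stand-in for camera/LiDAR/microphone captioning.
        return [Snippet("You see a person waving", source="camera")]

class Memory:
    def __init__(self):
        self.log: list[Snippet] = []

    def remember(self, snippets: list[Snippet]) -> None:
        self.log.extend(snippets)

class Planning:
    def plan(self, context: list[Snippet]) -> Snippet:
        # In OM1 this step is LLM-driven; a fixed rule stands in here.
        if any("waving" in s.text for s in context):
            return Snippet("wave back and greet", source="planner")
        return Snippet("idle", source="planner")

class Action:
    def act(self, decision: Snippet) -> None:
        print(f"executing: {decision.text}")

# One tick of the Perception -> Memory -> Planning -> Action loop.
perception, memory, planning, action = Perception(), Memory(), Planning(), Action()
observations = perception.sense()
memory.remember(observations)
action.act(planning.plan(memory.log))
```

Because every module speaks the same message type, any stage can be swapped out (a different captioner, a different planner) without touching the rest of the loop.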


OM1 Architecture


Architecturally, OM1's seven-layer pipeline is as follows (a sketch of one pass through these layers appears after the list):


Sensor Layer collects information: multimodal perception inputs such as cameras, LiDAR, microphones, battery status, and GPS.

AI + World Captioning Layer translates information: multimodal models convert vision, speech, and status into natural-language descriptions (e.g., "You see a person waving").

Natural Language Data Bus (NLDB) delivers information: all perception is converted into timestamped language snippets passed between modules.

Data Fuser integrates information: combines inputs from multiple sources into a complete decision-making context (the prompt).

Multi-AI Planning/Decision Layer (MAPDL) generates decisions: multiple LLMs read the context, combine it with on-chain rules, and produce an action plan.

NLDB Downlink transmits decisions: decision outcomes travel through the language intermediary layer down to the hardware execution system.

Hardware Abstraction Layer (HAL) executes actions: language instructions are translated into low-level control commands that drive the hardware (e.g., movement, speech synthesis, transactions).
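Assuming, purely for illustration, that the bus is a simple message queue carrying timestamped language snippets, one pass through these seven layers might look like this sketch (not OM1's actual interfaces):

```python
import queue
import time

nldb = queue.Queue()  # Natural Language Data Bus (layer 3)

# Layers 1-2: sensor input is captioned into timestamped sentences.
nldb.put((time.time(), "camera", "You see a person waving"))
nldb.put((time.time(), "battery", "Battery is at 82 percent"))

# Layer 4: the Data Fuser drains the bus and fuses snippets into one prompt.
snippets = []
while not nldb.empty():
    ts, source, text = nldb.get()
    snippets.append(f"[{source} @ {ts:.0f}] {text}")
prompt = "Context:\n" + "\n".join(snippets) + "\nWhat should the robot do?"

# Layer 5: normally several LLMs plus on-chain rules; stubbed with a constant.
decision = "wave and say hello"

# Layers 6-7: the language decision travels down the NLDB and the HAL
# turns it into a low-level device command.
command = {"action": "wave", "speech": "hello"}
print(prompt)
print("decision:", decision, "->", command)
```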


Quick Start, Wide Adoption


To turn "an idea" into "a task a robot can execute" quickly, OM1 compresses the development path into an all-in-one workflow (a sketch of a skill definition follows the list):

Developers define objectives and constraints in natural language with a large model, generating reusable skill packages in hours instead of months of hand-coding;

The multimodal pipeline natively integrates LiDAR, vision, and audio, removing the need for hand-written sensor-fusion code;

Pre-integrated models such as GPT-4o, DeepSeek, and mainstream VLMs make speech input and output usable out of the box;

System-level compatibility with ROS2 and Cyclone DDS, plus the HAL adaptation layer, connects OM1 to the Unitree G1 and Go2, TurtleBot, and various robotic arms;

Native integration with FABRIC's identity, task orchestration, and on-chain settlement interfaces lets robots operate autonomously, join the global collaboration network, and support pay-per-use billing and auditing.
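To make the workflow concrete, here is a hypothetical skill definition written as plain Python. OM1's real configuration format and toolchain may differ; every name below is invented for illustration.

```python
# Hypothetical OM1-style skill package: a natural-language goal plus
# constraints, bound to hardware through the HAL adaptation layer.
skill = {
    "name": "greet_visitors",
    "goal": "When you see a person waving, wave back and say hello.",
    "constraints": [
        "stay within 2 meters of the charging dock",
        "never exceed 0.5 m/s indoors",
    ],
    "hardware": "unitree_go2",  # resolved by the HAL adaptation layer
    "models": {"vision": "gpt-4o", "dialogue": "deepseek-chat"},
}

def deploy(skill: dict) -> str:
    """Stub for what a toolchain might do: validate the package and
    hand it to the runtime for the named hardware."""
    assert skill["goal"] and skill["hardware"], "incomplete skill package"
    return f"skill '{skill['name']}' deployed to {skill['hardware']}"

print(deploy(skill))
```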


In the real world, OM1 has been validated across scenarios: the quadruped platform Frenchie (a Unitree Go2) completed navigation tasks over complex terrain at the 2024 USS Hornet Defense Tech Show; the humanoid platform Iris (a Unitree G1) held live human-robot interactions at the Coinbase booth at ETHDenver 2025; and through the RoboStore education program, the same development paradigm has been extended into teaching and research at universities across the United States.


FABRIC: Decentralized Human-Robot Collaboration Network


Even with strong standalone intelligence, robots will not get far if they cannot collaborate on a trusted basis. The real-world disconnect stems from three fundamental gaps:

Identity and location lack standardized proof, so "who I am, where I am, and what I am doing" is hard for outsiders to trust;

Skills and data lack a controlled authorization path, so they cannot be safely shared and invoked across parties;

The boundaries of control and responsibility are unclear, so frequency, scope, and feedback conditions are hard to agree on in advance.

FABRIC answers these pain points with a systemic design: a decentralized protocol gives robots and operators verifiable on-chain identities, then builds task issuance and matching, end-to-end encrypted communication, execution records, and automated settlement around those identities, turning collaboration from ad hoc interfacing into an institutionalized, evidence-based process. A sketch of the identity idea follows.
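As a minimal sketch of verifiable identity and location, assume (as an illustration, not FABRIC's actual wire format) that each robot signs a timestamped beacon with a keypair whose public half anchors its on-chain identity:

```python
import json
import time
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Each robot holds a keypair; the public key anchors its on-chain identity.
robot_key = Ed25519PrivateKey.generate()
robot_pub = robot_key.public_key()

# A signed "who I am, where I am, what I am doing" beacon.
beacon = json.dumps({
    "robot_id": "om-go2-0042",  # hypothetical identifier
    "lat": 37.7749,
    "lon": -122.4194,
    "status": "executing:inspection",
    "ts": time.time(),
}, sort_keys=True).encode()

signature = robot_key.sign(beacon)

# A peer checks the beacon against the robot's registered public key;
# verify() raises InvalidSignature if the beacon was forged or altered.
robot_pub.verify(signature, beacon)
print("beacon verified")
```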


Operationally, FABRIC can be understood as a network plane that combines positioning, connection, and scheduling. Identity and location are continuously signed and verified, so nodes naturally maintain a mutually visible, mutually trusted neighbor relationship. Point-to-point channels act as on-demand encrypted tunnels, enabling remote control and monitoring without public IPs or complex network configuration. The whole flow from task publication through acceptance, execution, and verification is standardized and recorded, so settlement can automatically split revenue and refund deposits, and compliance or insurance scenarios can retrospectively establish who completed what, when, and where (a sketch of this lifecycle follows). Typical applications emerge naturally: enterprises can remotely operate equipment across regions; cities can offer cleaning, inspection, and delivery as callable Robot-as-a-Service; fleets can stream road conditions and obstacles into a shared live map; and robots can be scheduled locally on demand for 3D scanning, architectural surveying, or insurance evidence collection.
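Here is a small sketch of such a publish → accept → execute → verify → settle lifecycle. The states, deposit rule, and revenue split are illustrative assumptions, not protocol constants.

```python
from dataclasses import dataclass

@dataclass
class Task:
    task_id: str
    reward: float          # paid by the requester on publication
    deposit: float         # staked by the executing robot on acceptance
    state: str = "published"

def accept(task: Task, robot_id: str) -> None:
    assert task.state == "published"
    task.state = f"accepted:{robot_id}"  # robot stakes its deposit here

def verify_and_settle(task: Task, proof_ok: bool, operator_share: float = 0.9):
    """On verified completion: refund the deposit and split the reward.
    On failure: the deposit is forfeited to the requester."""
    if proof_ok:
        task.state = "settled"
        operator = task.reward * operator_share
        network_fee = task.reward - operator
        return {"deposit_refund": task.deposit,
                "operator": operator, "network_fee": network_fee}
    task.state = "defaulted"
    return {"deposit_to_requester": task.deposit}

t = Task("inspect-bridge-17", reward=100.0, deposit=20.0)
accept(t, "om-go2-0042")
print(verify_and_settle(t, proof_ok=True))
```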


When identity, tasks, and settlement live on the same network, the boundaries of collaboration are agreed in advance, the facts of execution are verifiable after the fact, and every skill invocation carries measurable cost and benefit. In the long run, FABRIC can evolve into an "application distribution layer" for machine intelligence: skills circulate globally under programmable authorization terms, and the data generated by each call feeds back into models and policies, letting the whole collaboration network upgrade itself within trusted constraints.


Web3 is Embedding "Openness" into the Machine Society


The robotics industry is rapidly consolidating around a few platforms, with hardware, algorithms, and networks locked into closed stacks. The value of decentralization is that robots of any brand, from any region, can collaborate, exchange skills, and settle transactions on the same open network without being tied to a single platform. OpenMind encodes this order in on-chain infrastructure: every robot and operator has a unique on-chain identity (ERC-7777) with a hardware fingerprint and verifiable permissions; tasks are published, bid on, and matched under public rules, and execution generates encrypted proofs of time and location stored on-chain; smart contracts automatically settle revenue sharing, insurance, and deposits on completion, with results verifiable in real time; and new skills specify call counts and device compatibility through contract terms, circulating globally while intellectual property stays protected (a licensing sketch follows). The robot economy is thus born with anti-monopoly, composable, and auditable genes, with "openness" embedded in the underlying protocols of the machine society.
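To illustrate the kind of contract terms described above (call counts, device compatibility, pay-per-call), here is a plain-Python model; a real deployment would live in a smart contract, and all names and numbers here are invented.

```python
from dataclasses import dataclass, field

@dataclass
class SkillLicense:
    skill_id: str
    price_per_call: float
    max_calls: int
    compatible: set                      # allowed device models
    calls_used: int = 0
    earned: float = field(default=0.0)   # accrues to the skill author

    def invoke(self, device_model: str) -> bool:
        """Charge one call if the device is compatible and quota remains."""
        if device_model not in self.compatible or self.calls_used >= self.max_calls:
            return False
        self.calls_used += 1
        self.earned += self.price_per_call
        return True

license = SkillLicense("stair-climb-v2", price_per_call=0.05,
                       max_calls=1000, compatible={"unitree_g1", "unitree_go2"})
assert license.invoke("unitree_go2")
assert not license.invoke("generic_wheeled")  # incompatible device rejected
print(license.calls_used, license.earned)
```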


Bringing Embodied Intelligence out of Isolation


R...


Targeting broader social distribution, OpenMind has turned the software gateway into a platform built on its investor ecosystem. Large crypto ecosystems such as Pi add further reach to this model, gradually forming a positive flywheel of "someone writes, someone uses, someone pays." The education channel provides stable supply, platform distribution brings demand at scale, and OM1 and its upper-layer applications thus gain a replicable expansion trajectory.


In the Web2 era, robots were usually locked into a single vendor's closed stack, and functions and data rarely flowed across platforms. By connecting education standards with distribution platforms, OpenMind makes openness the default setting: the same system enters campuses, moves into industry, and keeps spreading through the platform network, so openness becomes the starting point for large-scale deployment.



