# Roadmap

This roadmap details the strategic phases for developing the Zeno World platform, starting with two distinct yet related branches, Augmented Reality (AR) and Robotics, and culminating in a single, integrated hybrid-space platform.

The phases and milestones represent the technologies required to reach the end goal of Zeno World; some side-line products will also be built along the journey as each technology is unlocked.

<div data-full-width="false"><figure><img src="https://1019789088-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FmcqNuybmEO4SOyYE0qZJ%2Fuploads%2FcNzCvoQVIoJyMq1AKBFk%2Fdiagram.jpg?alt=media&#x26;token=1eac1ca0-b587-4837-b6c7-18d78f17118d" alt=""><figcaption></figcaption></figure></div>

#### Phase 1: Establishing the Foundations of Spatial Technology

The initial phase focuses on building robust foundations in two critical areas: Embodied AI (physical agents) and Augmented Reality (human enhancement). Both streams will operate with a strong emphasis on understanding and mapping physical space.

**Stream A: Embodied AI and Physical Interaction**

This branch focuses on enabling robots to perceive, understand, and interact intelligently with physical space.

**Milestone A.1: Localization Modular Component**

* A Localization Modular Component (LMC) capable of localization, planning, and navigation in large-scale physical spaces.
* Distributed spatial mapping to support LMC localization.
* **Deliverable**: Use the LMC to demo robot dogs performing more sophisticated tasks without remote control.
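
To make the planning part of this concrete: once a space is mapped, navigation can be reduced, at its simplest, to path search over an occupancy grid. The sketch below is a toy stand-in for the kind of planning an LMC would perform; it is plain A* search, and all names are illustrative, not the actual LMC interface.

```python
import heapq

def plan_path(grid, start, goal):
    """A* search over a 2D occupancy grid (0 = free, 1 = blocked).

    Returns a list of (row, col) cells from start to goal, or None if the
    goal is unreachable. A toy stand-in for LMC-style planning.
    """
    rows, cols = len(grid), len(grid[0])
    heuristic = lambda a, b: abs(a[0] - b[0]) + abs(a[1] - b[1])
    frontier = [(heuristic(start, goal), 0, start, [start])]
    seen = set()
    while frontier:
        _, cost, cell, path = heapq.heappop(frontier)
        if cell == goal:
            return path
        if cell in seen:
            continue
        seen.add(cell)
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                step = (nr, nc)
                heapq.heappush(frontier, (cost + 1 + heuristic(step, goal),
                                          cost + 1, step, path + [step]))
    return None
```

A real LMC would of course plan over continuous poses with dynamics and sensor uncertainty; the grid version only shows the shape of the problem.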

**Milestone A.2: Human-Physical Agents Collaboration**

* Enhance robots' understanding of semantic context and facilitate natural human-robot interaction.
* Robot assistant capable of understanding and executing multi-step commands and collaborating with humans on simple tasks.
* **Deliverable**: Demo rich interaction between humans and robot dogs without remote control.

**Stream B: Spatial Mapping & Augmented Reality**

This branch is dedicated to enriching the human experience by overlaying digital information onto the physical world.

**Milestone B.1: Spatial Mapping and Anchored Zones**

* High-precision, real-time spatial mapping capabilities for diverse environments.
* AR applications capable of stable object placement and re-localization in an anchored zone.
* Persistent content placement and multi-user co-localization.
* **Deliverable:** An SDK and a demo experience for playing within an anchored zone.
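
At its core, stable placement and re-localization mean storing anchored content in a persistent map frame and re-projecting it into the device's current frame each session. The minimal 2D sketch below illustrates that transform; it assumes a planar (x, y, theta) pose for simplicity, whereas a real AR system works with full 6-DoF poses.

```python
import math

def map_to_device(device_pose, anchor_xy):
    """Re-project an anchor stored in the persistent map frame into the
    device's current frame.

    device_pose = (x, y, theta) is the device's pose in the map frame,
    e.g. recovered by re-localization at session start. 2D only for
    illustration; real AR anchoring uses 6-DoF rigid transforms.
    """
    dx = anchor_xy[0] - device_pose[0]
    dy = anchor_xy[1] - device_pose[1]
    cos_t, sin_t = math.cos(-device_pose[2]), math.sin(-device_pose[2])
    return (dx * cos_t - dy * sin_t, dx * sin_t + dy * cos_t)
```

Multi-user co-localization follows the same idea: once every device resolves its own pose in the shared map frame, the same anchor renders consistently for everyone.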

**Milestone B.2: Scalable Global Zones**

* Global search for visual localization in a distributed and scalable way.
* Crowdsourced spatial mapping with self-growing and self-recovering mechanisms.
* **Deliverable:** An SDK and a demo experience for playing within global zones and contributing crowdsourced maps.
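
A common way to make global visual localization scale, sketched below as a general approach rather than a description of the actual system, is to shard the map by coarse geographic cell so a query only matches against nearby tiles. All class and method names here are hypothetical.

```python
def cell_key(lat, lon, cell_deg=0.01):
    """Quantize a coordinate to a coarse grid cell (roughly 1 km at this
    cell size near the equator)."""
    return (int(lat // cell_deg), int(lon // cell_deg))

class ShardedMapIndex:
    """Toy spatial index: map tiles are stored per cell, and a localization
    query retrieves only the tiles in the query's cell and its 8 neighbors,
    instead of matching against the whole global map."""
    def __init__(self):
        self.shards = {}

    def add_tile(self, lat, lon, tile):
        self.shards.setdefault(cell_key(lat, lon), []).append(tile)

    def candidates(self, lat, lon):
        r, c = cell_key(lat, lon)
        tiles = []
        for dr in (-1, 0, 1):
            for dc in (-1, 0, 1):
                tiles.extend(self.shards.get((r + dr, c + dc), []))
        return tiles
```

Crowdsourced contributions slot into the same structure: each uploaded scan becomes a tile in its cell, which is what allows the map to grow and recover locally without global rebuilds.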

**Milestone B.3: Dynamic Content & Interaction**

* Real-time recognition and semantic understanding.
* Interaction with dynamic digital content via beacon devices.
* **Deliverable:** Upgraded SDK and demo experiences.
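
Beacon-driven interaction typically reduces to estimating user proximity from received signal strength and triggering content inside a radius. The sketch below uses the standard log-distance path-loss model; the calibration values and trigger radius are illustrative defaults, not parameters of any actual beacon hardware.

```python
def beacon_distance(rssi, tx_power=-59, path_loss_exp=2.0):
    """Estimate distance in meters from RSSI via the log-distance
    path-loss model: rssi = tx_power - 10 * n * log10(d).

    tx_power is the calibrated RSSI at 1 m; -59 dBm and n = 2.0 are
    typical free-space values, used here only for illustration.
    """
    return 10 ** ((tx_power - rssi) / (10 * path_loss_exp))

def should_show_content(rssi, trigger_radius_m=2.0):
    """Reveal a beacon's dynamic digital content only when the estimated
    distance falls inside the trigger radius."""
    return beacon_distance(rssi) <= trigger_radius_m
```

In practice RSSI is noisy, so a real implementation would smooth readings over a window before triggering; the model above gives the core relationship.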

#### Phase 2: Convergence - Connecting Humans, Robots, and Agents

This phase marks the merging of the individual advancements to create a unified understanding and interaction model.

**Milestone C.1: Shared World Mapping with Robots and AR (Human)**

* Unified spatial mapping and digital twin of the physical environment, accessible and understandable by both humans (via AR) and robots.
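
Conceptually, a shared digital twin is a single registry of spatially anchored entities that every client, whether an AR headset or a robot, reads and writes in the same map frame. The toy sketch below shows that shape; all names are hypothetical and the 2D positions stand in for full spatial poses.

```python
class DigitalTwin:
    """Toy shared world model: entities live in one persistent map frame,
    so an AR client and a robot querying the twin see identical state."""
    def __init__(self):
        self.entities = {}

    def upsert(self, entity_id, position, kind, **attrs):
        """Add or update an entity at a map-frame (x, y) position.
        kind distinguishes physical objects from digital overlays."""
        self.entities[entity_id] = {"position": position, "kind": kind, **attrs}

    def query_near(self, position, radius):
        """Return ids of all entities within radius of a map-frame point,
        the query both a robot planner and an AR renderer would make."""
        x, y = position
        return [eid for eid, e in self.entities.items()
                if (e["position"][0] - x) ** 2 + (e["position"][1] - y) ** 2
                <= radius ** 2]
```

The key property is that physical and digital entities share one index, which is what lets a robot route around a digital obstacle a human just placed.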

**Milestone C.2: Interaction & Co-located Experiences**

* Enable seamless, co-located interaction where actions in the physical world affect digital agents.
* Enable digital agents to affect the physical world through IoT integration.

**Deliverable:** A scenario in which a robot and a human (wearing an AR device) collaboratively interact with both physical and superimposed digital objects in the same physical space.
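
The two directions of influence above can be modeled as a simple event bus: physical observations are published to digital agents, and agent decisions are dispatched to IoT actuators. A minimal sketch follows; the device names, events, and commands are made up for illustration.

```python
class HybridEventBus:
    """Toy bidirectional bridge between the physical and digital sides:
    physical events reach digital agents, and agent commands reach
    (here, simulated) IoT actuators."""
    def __init__(self):
        self.agent_handlers = []
        self.actuators = {}

    def on_physical_event(self, handler):
        """Subscribe a digital agent to physical-world events."""
        self.agent_handlers.append(handler)

    def register_actuator(self, name, command_fn):
        """Register an IoT actuator by name with a command callback."""
        self.actuators[name] = command_fn

    def publish(self, event):
        """Deliver a physical observation to every subscribed agent."""
        for handler in self.agent_handlers:
            handler(event)

    def command(self, actuator, *args):
        """Let a digital agent drive a physical actuator."""
        return self.actuators[actuator](*args)
```

For example, an agent subscribed to a hypothetical `"human_waved"` event could respond by commanding a registered `"lamp"` actuator, closing the loop from physical gesture to digital decision to physical effect.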

#### Phase 3: Zeno Platform and Open World - The End Goal

The final phase solidifies the unified World Model into a comprehensive, extensible platform that serves as the "hybrid space" for humans, robots, and other agents.

**Milestone D.1: Zeno Open Platform**

* Consolidate all functionalities into a scalable "Grand World Model" capable of generation, creation, multi-reality superimposition, and real-world entity control.
* An API and SDK for applications to build blended experiences across all kinds of agents.
* Tools and an editor application for developers to manipulate a blended world across physical space and digital overlays.
* **Deliverable:** A functional "Zeno World" API, SDK and tools, demonstrating initial capabilities for generating dynamic digital content and controlling robotic agents within a single, unified environment.
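
To illustrate what "a single, unified environment" means for application developers, the stub below sketches one session object through which an app both spawns digital content and commands a physical agent. It is a purely illustrative shape, not the actual Zeno World API; every name is invented.

```python
class ZenoWorld:
    """Purely illustrative stub of the unified-platform idea: one session
    through which an app spawns digital overlay content and commands
    physical agents. Not the real Zeno World API."""
    def __init__(self):
        self.objects = []
        self.commands = []

    def spawn(self, kind, position):
        """Place a piece of digital content at a map-frame position and
        return its id."""
        self.objects.append((kind, position))
        return len(self.objects) - 1

    def command_agent(self, agent_id, action, target):
        """Direct a physical agent at a target in the same world."""
        self.commands.append((agent_id, action, target))

world = ZenoWorld()
gem = world.spawn("gem", (3.0, 1.0))              # digital overlay content
world.command_agent("robot-dog-1", "fetch", gem)  # physical agent, same session
```

The point of the sketch is only that content creation and agent control flow through one world model rather than separate AR and robotics stacks.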

**Milestone D.2: Zeno World**

* Zeno World is a standalone "game" that allows playing with and creating digital content in physical space, interactable by human players and robots anywhere on Earth.
* A world engine that makes digital content and agents programmable and durable.
* A content and agent store for creators to share, distribute, and monetize their creations in physical space.
* **Deliverable:** The Zeno World and its ecosystem.

<figure><img src="https://1019789088-files.gitbook.io/~/files/v0/b/gitbook-x-prod.appspot.com/o/spaces%2FmcqNuybmEO4SOyYE0qZJ%2Fuploads%2FsrFS114gBzfXjwnwUKcK%2Ffooter.jpg?alt=media&#x26;token=2f720d21-8738-4fbf-95ac-f6b102a82b36" alt=""><figcaption></figcaption></figure>
