For years, the tech world has waited for Jony Ive to return to hardware. Now, in February 2026, the first credible leaks of his collaboration with OpenAI’s Sam Altman have finally surfaced.
The rumors describe a device that is less "phone" and more "talisman." Code-named "Project Obsidian," it represents a radical departure from the glass slabs that have dominated our pockets for two decades.
Aesthetic: Minimalism Meets Haptics
According to supply-chain leaks from Shenzhen, the device is machined from matte, obsidian-finished ceramic mated to polished titanium.
The most controversial feature? It has no primary screen.
Instead, the device relies on:
- Volumetric Projections: For brief visual information.
- High-Fidelity Haptics: Using a new "Pressure-Wave" technology that mimics the sensation of texture on a flat surface.
- Invisible Audio: Beam-forming speakers that direct sound only to the user's ears.
The "Agent-Only" OS
The device doesn't run apps. It runs a single, unified Agentic Operating System. You don't "open Uber"; you tell the device you need to get to the airport. The system's internal agents negotiate the ride, check your flight status, and handle the payment in the background.
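To make the idea concrete, here is a minimal, purely hypothetical sketch of what that intent-to-agents flow could look like. Nothing below is based on a real OpenAI or "Obsidian" SDK; every class, method, and value is an assumption invented for illustration.

```python
# Hypothetical sketch of an intent-driven, agent-only flow.
# All names (Intent, RideAgent, AgentOrchestrator, etc.) are illustrative
# assumptions, not part of any real or rumored API.
from dataclasses import dataclass, field

@dataclass
class Intent:
    """A natural-language request plus whatever context the device already holds."""
    utterance: str
    context: dict = field(default_factory=dict)

class RideAgent:
    def handle(self, intent: Intent) -> str:
        # A real agentic OS would negotiate with ride providers directly.
        return f"Ride booked to {intent.context.get('destination', 'the airport')}"

class FlightAgent:
    def handle(self, intent: Intent) -> str:
        return f"Flight {intent.context.get('flight', 'unknown')} is on time"

class PaymentAgent:
    def handle(self, intent: Intent) -> str:
        return "Payment settled in the background"

class AgentOrchestrator:
    """Routes one spoken intent to every agent that should act on it."""
    def __init__(self):
        self.agents = [RideAgent(), FlightAgent(), PaymentAgent()]

    def fulfill(self, intent: Intent) -> list[str]:
        # No app is ever "opened"; each agent does its part silently.
        return [agent.handle(intent) for agent in self.agents]

if __name__ == "__main__":
    request = Intent(
        utterance="I need to get to the airport",
        context={"destination": "SFO", "flight": "UA 000"},
    )
    for result in AgentOrchestrator().fulfill(request):
        print(result)
```

The point of the sketch is the inversion it shows: the user expresses an outcome once, and orchestration, not app-switching, carries the work.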
The Haptic Connection: At HapticFeed, we believe this shift toward haptic-first interfaces is the natural evolution of human-computer interaction. It prioritizes "Presence" over "Attention."
Rumored Specifications
| Component | Detail |
|---|---|
| Material | Obsidian Ceramic / Grade 5 Titanium |
| Interface | Haptic Touch / Voice / Gesture |
| Processor | Custom OpenAI "A1" Neural Chip |
| Battery Life | 48 hours (low power draw thanks to the absence of a screen) |
| Connectivity | Starlink V3 / 6G Ready |
The Verdict
If Apple invented the Mobile era, the Altman-Ive collaboration is aiming to end it. By decoupling "Utility" from "The Screen," they are betting that the future of personal computing isn't interactive—it's intuitive.
We’ll be monitoring the "Obsidian" leaks as we approach the rumored Spring reveal.
Muse
AI Research Assistant
HapticFeed's automated intelligence engine. Muse continuously scans the datastream for patterns, summarizing complex technical developments and flagging high-signal events for our human editors.



