Ex-Apple Designer Reveals 'Living Glass' iOS 26 Concepts

In a bold and imaginative leap forward, an ex-Apple designer has unveiled a series of groundbreaking concept designs dubbed “Living Glass,” aimed at reimagining the user interface of the upcoming iOS 26. These concepts take Apple’s long-standing design ethos of simplicity and elegance and fuse it with a futuristic, AI-driven dynamic system that makes the iPhone interface feel truly alive. While these designs are not official Apple creations, they have ignited widespread discussions among designers, tech enthusiasts, and Apple loyalists.

The “Living Glass” concept doesn't merely refine existing elements—it redefines how users might interact with their devices in the post-AI era. Here’s a deep dive into what makes these iOS 26 concepts so visionary and how they reflect not only Apple’s legacy but also its possible future.


The Origins: From Cupertino to Conceptual Visionary

The designer behind these ideas, Thomas Eldridge, spent over a decade at Apple, contributing to projects across iOS and macOS. Known for his minimal yet expressive design sense, Eldridge left Apple in 2023 to pursue independent work at the intersection of interface design and generative AI. He started posting speculative designs on his blog and social media, and what began as digital sketches soon gained traction among followers hungry for a glimpse of what could be.

His latest work, “Living Glass,” is described as a "conceptual design language" for iOS 26 that blurs the line between the interface and the user. It builds on Apple’s real-world trend toward more personalized, adaptive systems like Apple Intelligence, introduced with iOS 18, and takes it a step further.


What Is “Living Glass”?

“Living Glass” is not a single feature, but a combination of ideas that form a dynamic UI layer atop the traditional iOS framework. It proposes a living, breathing user interface—one that morphs, adapts, and evolves based on user context, habits, and environment. The design system is based on three pillars:

  1. Ambient Adaptability

  2. Dynamic Depth

  3. Conversational Control

Let’s explore each of these in detail.


1. Ambient Adaptability: The Interface That Knows You

The concept introduces UI elements that can subtly shift their appearance and function depending on environmental cues, such as lighting, location, and even the user’s emotional tone detected through voice. For instance:

  • Home Screen Color Temperature: The home screen subtly shifts in color tone to match your ambient environment. On a sunny day at the beach, icons take on a warm hue; at night, they cool down with a soft, bluish tint.

  • Mood Detection: Using voice input, facial cues, and typing cadence, the system can estimate the user’s mood and respond accordingly. If you’re stressed, your iPhone may switch to a simplified UI and suggest a Focus mode or a mindfulness app.

This sort of adaptiveness resembles features found in the Apple Watch, like blood oxygen detection and mood journaling, but transposed into a visual, interactive experience across the whole system.
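To make the ambient idea concrete, here is a minimal sketch of how such a tint adaptation might be modeled. Everything here is hypothetical — the function name, the 0-to-1 "temperature" scale, and the thresholds are illustrative assumptions, not an Apple API:

```python
# Hypothetical sketch (not an Apple API): map ambient conditions to an
# icon tint "temperature" on a 0.0 (cool/bluish) .. 1.0 (warm) scale.

def tint_temperature(lux: float, hour: int) -> float:
    """Warmer in bright daylight, cooler at night, clamped to [0, 1]."""
    daytime = 7 <= hour <= 19
    base = 0.7 if daytime else 0.3
    # Bright environments push warmer; dim ones push cooler.
    light_bias = min(lux, 10_000) / 10_000 * 0.3
    value = base + light_bias if daytime else base - light_bias
    return min(1.0, max(0.0, value))

# A sunny noon reads fully warm; a dim midnight reads cool.
print(tint_temperature(lux=10_000, hour=12))  # 1.0
print(tint_temperature(lux=0, hour=23))       # 0.3
```

In a real system, this scalar would feed the rendering pipeline rather than print to a console; the point is that the adaptation can be a small, deterministic mapping over on-device sensor readings.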


2. Dynamic Depth: UI That Moves With You

Perhaps the most striking part of the “Living Glass” concept is its use of motion and spatial depth. Inspired by the parallax and layered effects introduced back in iOS 7, this system expands the idea by introducing “living layers” that respond to gesture, gaze, and device orientation.

  • Fluid App Cards: Instead of static multitasking windows, apps appear as fluid glass panels that ripple as you swipe between them, adding a tactile feel to the interface.

  • Interactive Widgets: Widgets aren’t just glanceable—they’re stretchable and compressible. Pinching a calendar widget, for example, might zoom it out into a full event manager. Letting go collapses it into a single date with a subtle ripple effect.

  • Eye-Tracking Navigation: For devices equipped with advanced front-facing sensors, the system can follow your gaze and shift UI elements accordingly, offering accessibility enhancements and futuristic control schemes.

Eldridge describes this as “glass that doesn’t reflect light—it reflects you.”
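The "living layers" effect can be approximated with a classic parallax rule: deeper layers shift less than surface layers as the device tilts or the gaze moves. The sketch below is a toy model under assumed units (tilt normalized to −1…1, offsets in points) — not Eldridge's implementation:

```python
# Hypothetical "living layers" parallax: deeper layers move less than
# frontmost ones in response to device tilt (or a gaze vector).

def layer_offset(tilt_x: float, tilt_y: float, depth: int,
                 max_shift_pts: float = 12.0) -> tuple:
    """Offset in points for one layer; depth 0 = frontmost, larger = deeper."""
    attenuation = 1.0 / (1.0 + depth)  # deeper layers are attenuated more
    return (tilt_x * max_shift_pts * attenuation,
            tilt_y * max_shift_pts * attenuation)

# The front layer shifts the full 12 pt at maximum tilt; depth-1 shifts half.
print(layer_offset(1.0, 0.0, depth=0))  # (12.0, 0.0)
print(layer_offset(1.0, 0.0, depth=1))  # (6.0, 0.0)
```

The same attenuation curve works whether the input comes from the accelerometer or from eye tracking, which is what would let one motion system serve both the aesthetic and the accessibility use cases described above.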


3. Conversational Control: AI as a Design Layer

Building on Apple Intelligence, the concept envisions a system where the AI not only assists but designs with you. This isn’t just Siri answering questions—it’s a design co-pilot for your daily interactions.

  • Live Interface Editing: Want to move a button, change a widget color, or adjust your home screen layout? Just say, “Move this app to the top and make it a little bigger.” The system instantly responds and previews the change.

  • Semantic Shortcuts: Natural language replaces fixed commands. You can say, “Show me today’s news in blue tone and play jazz until 9,” and the interface shifts accordingly, merging visual aesthetics with task execution.

  • Adaptive App Themes: The system learns your preferences and applies AI-generated themes across apps. For instance, if you often use a dark theme in reading apps, it might suggest a consistent visual style across Mail, Safari, and Notes.

This conversational interface repositions the iPhone as less of a static tool and more of a collaborative assistant.
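A semantic shortcut ultimately has to reduce free-form language to structured UI edits. A production system would use an on-device language model; the toy parser below just pattern-matches a few phrases to show the shape of the output. The operation names and fields are invented for illustration:

```python
# Hypothetical "semantic shortcuts" sketch: turn a free-form request into
# a list of structured UI edits. A real system would use an on-device
# language model; this toy version matches a handful of phrases.
import re

def parse_edit(command: str) -> list:
    edits = []
    cmd = command.lower()
    if m := re.search(r"move (?:this app|\w+) to the (top|bottom)", cmd):
        edits.append({"op": "move", "anchor": m.group(1)})
    if "bigger" in cmd:
        edits.append({"op": "scale", "factor": 1.25})
    if m := re.search(r"in (\w+) tone", cmd):
        edits.append({"op": "tint", "color": m.group(1)})
    return edits

print(parse_edit("Move this app to the top and make it a little bigger"))
# [{'op': 'move', 'anchor': 'top'}, {'op': 'scale', 'factor': 1.25}]
```

Whatever the parsing machinery, the key design idea is the intermediate representation: once intent is expressed as discrete edits, the system can preview, apply, or undo them — which is what makes "live interface editing" tractable.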


Apple DNA in Every Pixel

Despite the experimental nature of the concepts, Eldridge’s designs are steeped in Apple tradition. The rounded corners, flat iconography, and focus on touch-first interaction remain intact. But the innovation lies in how these elements are brought to life through real-time responsiveness and personalization.

There’s also a heavy emphasis on privacy—a nod to Apple’s core values. In his write-up, Eldridge insists that all context-aware features run on-device, echoing Apple’s direction with on-device processing for AI tasks. He even proposes a “Transparency Layer,” a UI element that appears whenever your data is being used to inform design changes, giving you the option to opt out or inspect how the system is adapting.
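One way to picture the proposed Transparency Layer is as an auditable gate between signals and adaptations: every change records which signals informed it, and opting out of a signal blocks future changes that depend on it. The class and field names below are assumptions for illustration, not anything from Eldridge's write-up:

```python
# Hypothetical Transparency Layer sketch: adaptive changes are logged with
# the on-device signals that informed them, and any signal can be opted
# out, which suppresses future changes that depend on it.
from dataclasses import dataclass, field

@dataclass
class TransparencyLayer:
    opted_out: set = field(default_factory=set)
    log: list = field(default_factory=list)

    def apply(self, change: str, signals: list) -> bool:
        if any(s in self.opted_out for s in signals):
            return False                    # blocked by a user opt-out
        self.log.append((change, signals))  # inspectable audit trail
        return True

layer = TransparencyLayer()
layer.apply("warm icon tint", ["ambient_light"])  # allowed and logged
layer.opted_out.add("typing_cadence")
layer.apply("simplified UI", ["typing_cadence"])  # blocked, not logged
```

Keeping the log on-device is what would let the UI surface "why did this change?" without any data leaving the phone.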


Community and Designer Reactions

Designers and developers across the web have reacted enthusiastically to the “Living Glass” concept. Prominent UI/UX professionals have praised it for being “a genuine evolution of the iOS interface paradigm,” and some are already experimenting with similar animations and adaptive themes in SwiftUI mockups.

On Reddit’s r/Apple and r/iOSBeta, users were captivated by the fluidity of Eldridge’s designs. One user commented, “This is what Apple would do if they weren’t constrained by hardware cycles and conservative updates.” Others speculated whether parts of “Living Glass” could appear in the rumored iOS 26, especially as Apple begins to more deeply integrate Apple Intelligence across all devices.


Could Apple Adopt These Ideas?

While Apple is famously secretive, there is precedent for rehiring former employees and for outside concepts shaping internal work. It’s not outlandish to imagine that ideas like dynamic widgets, real-time customization, or mood-adaptive interfaces could influence future iOS releases.

There are also signs Apple is already thinking along these lines. In 2024, they filed several patents for “emotion-aware interfaces,” “contextual display adaptation,” and “semantic input engines”—all of which align with elements of the “Living Glass” concept.

Moreover, as iPhones become more powerful—with chips like the A19 Pro rumored to focus on AI workloads—there’s less technical limitation standing in the way of these sorts of dynamic, intelligent UIs.


The Future of Interface: From Static to Sentient

“Living Glass” isn’t just a flashy concept—it points to a larger trend in software design. As devices become smarter and more aware, users expect more than just speed and efficiency. They want devices that understand them, adapt to them, and even grow with them.

The iOS interface has remained largely consistent since iOS 7. Each update brings refinements, but the metaphor of icons on a grid has stayed intact. “Living Glass” challenges this notion, asking: What if the interface could breathe?

As Apple continues to evolve into an AI-first company—with Apple Intelligence being a cornerstone of recent announcements—the idea of an interface that reflects your mood, context, and preferences feels less speculative and more inevitable.



Closing Thoughts

While we may never see the exact designs of “Living Glass” implemented in iOS 26, Thomas Eldridge’s concept has opened a vibrant conversation about what the next generation of mobile interfaces could look like. It's a compelling vision—one where our devices don't just respond to taps and swipes, but to who we are and how we feel.

As we move deeper into the AI age, the interface will become less about screens and icons, and more about interactions, intuitions, and relationships. “Living Glass” offers a glimmer of that future—a future where your iPhone isn’t just a tool, but a companion that lives and breathes with you.
