Multi-paradigm, not multi-modal
Feb 20, 2025
Every AI app defaults to chat. Type your prompt, get a response. Rinse, repeat. Is this really the best we can do?
There are three ways we can interface with AI, each with its own trade-offs:
Text or voice chat interfaces are powerful because they match how AI models actually work. They're flexible and open-ended. But they're also unpredictable: slight changes in wording, or minor changes in models and prompts, can lead to drastically different results. Frustratingly, you discover their limitations only through trial and error.
Traditional UIs feel familiar because they build on decades of design evolution that set clear expectations and boundaries. But add too many features and they become cluttered and hard to navigate, prompting redesigns that force users to relearn their workflows.
Adaptive interfaces promise to mold themselves around user needs. They're exciting in theory, but they become disorienting when users can't build consistent patterns of interaction.
At thatworks.ai, we're taking a different approach. We start with familiar, focused interfaces to help users get things done quickly. As complexity grows, we selectively switch to turn-based interaction where the cognitive overhead makes sense, or adaptive elements (like our automated summaries) when users can define their own boundaries.
The future of interaction isn't about choosing one paradigm. It's about building in the fluidity to move between them, knowing when each one is effective. Our AI products don't just need to be multi-modal; they need to be multi-paradigm.