A 2016 sketch, made with the Paper app on an iPad, of a case management system: the conversation, user flows, and the user interface components that came up in that conversation

Every time a new iPad is released, or iPadOS is updated, much of the conversation turns to “but can it replace your computer?” I’ve never liked that framing one bit. I haven’t merely replaced laptop/desktop usage; I’ve done things with the iPad’s malleable, direct-input canvas that (as Steve Jobs noted) fits in between the laptop and mobile modes of work.

Attached to this is a note from a conversation which, while working through user flows, aimed to answer user experience questions the conversation had missed. That conversation spanned the better part of an hour, and I was there only to listen. But, thanks to the iPad, I could listen, scribble, design, and even test assumptions without leaving the space of being immersed in what they were trying to solve.

The point of the iPad isn’t to replace your mobile or laptop so much as to help you use more senses than a few fingers, directly transcribing or scribbling what’s in your head with fewer intermediaries. That canvas is what’s missing from some of Apple’s execution of the iPad, sure. But it’s also missing from much of its media audience.

It’s not about making more noise, but about helping you build better behaviors to discern signal from noise. And, when possible, to share with others the pieces they need to decide on, sometimes with a visual depth they cannot yet reach themselves. That type of “thinking different” is where I’ve been with tablets since before the iPad.

If you think computing should have evolved long ago, far beyond clicking and consumption to something more, shouldn’t you expect and behave differently? And shouldn’t your canvas provoke such a focus?