Avanceé.Agency

Musings on designing experiences & (re)engineering complexity

Jun 2024

Updates to Everysight Maverick Dev Edition Experiment

The Everysight Maverick Developer Edition is a pair of augmented reality/heads-up display (HUD) glasses, released to help the company establish a solid foundation with developers ahead of commercial releases. We purchased a pair in anticipation of understanding the hardware, getting some sense of its usability, and potentially working with the software development kit (SDK). This post serves as something of an update on that research, some of the lessons learned, and a bit of a wish list for this and similar products.

Glasses and a smartphone sitting on top of an ottoman; the glasses have their arms folded behind them, and the iPhone 15 Pro has an application settings screen open

Research Update

IMO, the Everysight Maverick Developer Edition needs to be more than just a heads-up display for another device one carries. Or, more specifically, your contextual awareness needs to extend further than simply knowing metrics. For example:

  • connecting to devices that show rearward traffic approaching you
  • marking routes as fast, slow, or frequently traveled when connected to a phone or other GPS-equipped device that shows a similar route
  • marking a pothole or other road obstruction when you see it (in this case, a tap on the side of the glasses), which then gets shared to the other mapping and activity-tracking services, either immediately, if the glasses are connected to a device doing live syncing, or after the route is done and you go back online (a rough sketch of this flow follows the list)
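
To make that last idea a bit more concrete, here’s a minimal sketch of what the mark-and-sync flow could look like in Swift. Everything in it, from the HazardMark type to the flush hand-off, is a hypothetical illustration; none of it is part of the Everysight SDK or any of the mapping services mentioned.

```swift
import Foundation

// Hypothetical model for a rider-reported hazard. Names and fields are
// assumptions for illustration, not part of the Everysight SDK.
struct HazardMark: Codable {
    let kind: String          // e.g. "pothole", "debris"
    let markedAt: Date
    let latitude: Double?     // nil when no GPS source is connected
    let longitude: Double?
}

// Queue hazards locally and flush them to mapping/activity services when online.
final class HazardQueue {
    private var pending: [HazardMark] = []

    // Called when the rider taps the side of the glasses.
    func mark(_ kind: String, latitude: Double? = nil, longitude: Double? = nil) {
        pending.append(HazardMark(kind: kind, markedAt: Date(),
                                  latitude: latitude, longitude: longitude))
    }

    // Called either immediately (live-syncing companion device) or after the ride.
    // `upload` returns true when a mark was accepted by the outside service.
    func flush(to upload: (HazardMark) -> Bool) {
        pending = pending.filter { !upload($0) }   // keep anything that failed to send
    }
}
```

The point being: the tap on the glasses only needs to capture a timestamp (plus a position, if some connected device can supply one); sharing to outside services is just a queue that drains whenever connectivity exists.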

Right now, these are only good enough for showing metrics, but augmentation can likely go further. There’s just enough in Everysight’s example applications to hint that more can be done.

The reliance on a mobile device (iPhone or Android) also bugs me. Why can’t it be easier to have the glasses pair to an Apple Watch? What am I missing, in terms of knowledge or ability, to make that the direct connection rather than an edge case? Connecting directly to any wearable feels like it should be table stakes, even though the best benefit (for those who’d white-label this) would be the usages found through a dedicated API, which right now is only exposed through a mobile app.

If using these without prescription lenses (as I currently am), what can the augmentation begin to look or sound like?

  • using voice and visual prompts for road and weather hazards
  • ghost views of previous routes and your timing when traveling similar routes (how the device would do this without GPS is still something I’m working out; a sketch of one distance-based approach follows the list)
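
One way around the missing GPS, sketched below, would be to compare by cumulative distance rather than by position: if the glasses (or a paired device) can read a wheel-speed sensor, for example over the standard BLE Cycling Speed and Cadence profile, then the “ghost” is just elapsed time recorded at distance checkpoints. The types here are hypothetical and only illustrate the comparison, not anything in the Everysight SDK.

```swift
import Foundation

// One sample of a previously ridden route: how long it took to reach a given
// cumulative distance. Distance could come from a wheel-speed sensor rather
// than GPS.
struct GhostSample {
    let distanceMeters: Double
    let elapsedSeconds: TimeInterval
}

// Compare the current ride against the stored "ghost" by distance, not position.
struct GhostComparator {
    let ghost: [GhostSample]   // sorted by distanceMeters

    // Positive result: rider is ahead of the previous effort; negative: behind.
    func delta(atDistance distance: Double, elapsed: TimeInterval) -> TimeInterval? {
        guard let match = ghost.last(where: { $0.distanceMeters <= distance }) else { return nil }
        return match.elapsedSeconds - elapsed
    }
}
```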

Because these glasses have no camera or other sensing mechanism, their value is lower unless they are connected to another device. Within the small display, though, there is still plenty of room for extending perceptive information. At the same time, because of that reliance, someone may decide to use the glasses instead of a connected device for an activity, so any ability for the glasses to record time and speed on their own, without directions or location, could prove worthwhile in higher-security situations.

Wish List of Sorts

Connect the glasses directly to an Apple Watch and then show the metrics as they would appear on your Apple Watch, but in the heads-up display. The application would simply facilitate the connection, and the Apple Watch and Apple Health data would be what populates the display.
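
Here’s a minimal sketch of the watch-side half of that idea, assuming HealthKit as the metrics source. The HUDMetrics payload and the sendToGlasses() hand-off are placeholders for illustration; the actual Everysight pathway currently runs through its mobile SDK and app.

```swift
import HealthKit

// Hypothetical payload the watch app would push toward the glasses' HUD.
// The actual Maverick transport and schema are assumptions here.
struct HUDMetrics: Codable {
    var heartRateBPM: Double?
    var speedMetersPerSecond: Double?
}

final class WatchMetricsBridge {
    private let healthStore = HKHealthStore()
    private var metrics = HUDMetrics()

    // Ask the user for read access to heart rate, then start streaming samples.
    func start() {
        guard let heartRate = HKQuantityType.quantityType(forIdentifier: .heartRate) else { return }
        healthStore.requestAuthorization(toShare: nil, read: [heartRate]) { ok, _ in
            guard ok else { return }
            self.streamHeartRate(heartRate)
        }
    }

    private func streamHeartRate(_ type: HKQuantityType) {
        // Anchored query delivers new heart-rate samples as they are recorded.
        let query = HKAnchoredObjectQuery(type: type,
                                          predicate: nil,
                                          anchor: nil,
                                          limit: HKObjectQueryNoLimit) { _, samples, _, _, _ in
            self.push(samples)
        }
        query.updateHandler = { _, samples, _, _, _ in
            self.push(samples)
        }
        healthStore.execute(query)
    }

    private func push(_ samples: [HKSample]?) {
        guard let latest = (samples as? [HKQuantitySample])?.last else { return }
        let bpm = latest.quantity.doubleValue(for: HKUnit.count().unitDivided(by: .minute()))
        metrics.heartRateBPM = bpm
        if let payload = try? JSONEncoder().encode(metrics) {
            sendToGlasses(payload)  // placeholder hand-off, not a real Everysight API
        }
    }

    private func sendToGlasses(_ payload: Data) {
        // Left as a stub: the Everysight SDK currently routes rendering through
        // its mobile app, so this is where that hand-off would live.
    }
}
```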

Directly connect to cycling or motorcycle computers (like the Beeline or Garmin) and push metrics into the HUD from there.

Connect the glasses directly to an electric bicycle’s computer. In this way, the glasses could act as a secondary authentication key, as well as use the heads-up display for metrics or for changing some high-level settings while riding.
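
As a sketch of that second-factor idea (the names and flow are assumptions, not an existing Everysight or e-bike API), the bike could issue a random challenge and the glasses answer with an HMAC over it using a key shared at pairing time:

```swift
import CryptoKit
import Foundation

// Hypothetical second-factor handshake: the bike sends a random challenge,
// the glasses answer with an HMAC computed with a key provisioned at pairing.
struct GlassesAuthenticator {
    let sharedKey: SymmetricKey   // provisioned when glasses and bike are paired

    // Glasses side: answer the bike's challenge.
    func respond(to challenge: Data) -> Data {
        Data(HMAC<SHA256>.authenticationCode(for: challenge, using: sharedKey))
    }

    // Bike side: verify the response before unlocking assist or settings.
    func verify(response: Data, for challenge: Data) -> Bool {
        HMAC<SHA256>.isValidAuthenticationCode(response, authenticating: challenge, using: sharedKey)
    }
}
```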

Connect to the Humane AI Pin, replacing its laser display for heads-up notifications, while still keeping some of the gesture interactions on the Pin, and perhaps using the glasses’ gesture area for things like a yes/no-type selector.

Prescription lenses. Specifically, either replacing the current lenses with prescription lenses that could also be offered as transition lenses, or replacing the existing sunglasses-only lens with a transition lens and letting the prescription lenses be the extra lenses used with the attachment (the current design).

Connecting to Lumos or other connected helmets directly, without going through an application on the mobile, enabling a person to see things such as the helmet’s battery life or to perform other high-level controls such as activating/deactivating turn signals.
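
The battery-life piece, at least, maps onto standard Bluetooth: the Battery Service (0x180F) and its Battery Level characteristic (0x2A19) are part of the GATT specification. Whether Lumos helmets expose them this way, and whether the glasses could host the central role directly, are assumptions on my part; the sketch below only shows the read.

```swift
import CoreBluetooth

// Read a connected helmet's battery over the standard BLE Battery Service.
// Assumes the helmet advertises the Battery Service (0x180F); real helmets may
// require connecting first and discovering it afterward.
final class HelmetBatteryReader: NSObject, CBCentralManagerDelegate, CBPeripheralDelegate {
    private let batteryService = CBUUID(string: "180F")
    private let batteryLevel = CBUUID(string: "2A19")
    private var central: CBCentralManager!
    private var helmet: CBPeripheral?

    override init() {
        super.init()
        central = CBCentralManager(delegate: self, queue: nil)
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        central.scanForPeripherals(withServices: [batteryService], options: nil)
    }

    func centralManager(_ central: CBCentralManager, didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any], rssi RSSI: NSNumber) {
        helmet = peripheral                 // keep a strong reference while connecting
        central.stopScan()
        central.connect(peripheral, options: nil)
    }

    func centralManager(_ central: CBCentralManager, didConnect peripheral: CBPeripheral) {
        peripheral.delegate = self
        peripheral.discoverServices([batteryService])
    }

    func peripheral(_ peripheral: CBPeripheral, didDiscoverServices error: Error?) {
        guard let service = peripheral.services?.first(where: { $0.uuid == batteryService }) else { return }
        peripheral.discoverCharacteristics([batteryLevel], for: service)
    }

    func peripheral(_ peripheral: CBPeripheral, didDiscoverCharacteristicsFor service: CBService,
                    error: Error?) {
        guard let level = service.characteristics?.first(where: { $0.uuid == batteryLevel }) else { return }
        peripheral.readValue(for: level)
    }

    func peripheral(_ peripheral: CBPeripheral, didUpdateValueFor characteristic: CBCharacteristic,
                    error: Error?) {
        guard let byte = characteristic.value?.first else { return }
        print("Helmet battery: \(byte)%")    // this value is what the HUD would surface
    }
}
```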

What Is the Hold-Up?

A few things are preventing some of these ideas from being built. The biggest is just getting up to speed with the programming languages needed before integrating the Maverick SDK; that’s a personal hang-up, not anything with Maverick or Apple. Other challenges have included time, and staying attentive to this long enough to actually fight through the ideas and figure out exactly how to implement them. This may be an area where partnering with another developer comes into play.

It’s very possible that BMW (and other companies building on top of the Everysight Maverick Developer Edition device) are iterating with the same mindset and looking to release their software with locked, core integrations to their dedicated hardware. That would make some of the ideas here very good but, again for lack of implementation, not worth much in practice. The opportunity shouldn’t be held up by a lack of knowledge (so to speak), nor by not-yet-existent market conditions.

Glasses, wireless headphones inside a white case, and an artificial-intelligence lapel pin, all sitting on top of an ottoman

Sparks in Awareness

Sitting here, journaling about what I’ve done and haven’t done, I see potential in the Everysight Maverick to be a solid node in a constellation of devices. There are types of interactions that should facilitate extending our perception of the world around us, while still being competent enough to deliver the beautiful layer of metrics some people desire and that could make a “market” out of this. There are other glasses that might be closer. But I think, especially with the focus on cycling and motorcycling, Everysight has the right idea for “awareness” augmentation, not just another beeping dot.

Crazy how such a reflection is sparked by new, similar glasses that have gotten some attention. If those make it to market, it would be solid. But many of the comments about the Everysight Maverick Dev Edition hold just as well for them.