See Things Differently with HumanWare
Have you ever wondered what it’s like to really see the world differently?

S1E16 - Guiding the Way with Glide and JAWS for BrailleNote Evolve

Transcript

Hello and welcome to the HumanWare See Things Differently podcast.

Each month, join your hosts, Rachel Ramos and David Woodbridge, as we bring you engaging interviews from guests, interaction from our one-of-a-kind distribution partners, stories that will take you off the beaten path, current promos, shows where you can find us, and so much more.

So stick around.

Hey everybody, and welcome to episode 16 of the See Things Differently podcast.

This episode is truly a sweet 16, because we have not one, but two special guests on here.

I am, of course, with David Woodbridge, my trusty co-host, and the first of two special guests I'd love to introduce is my counterpart, Joel Zimba, who is the other product specialist; we run around the country together. He is on the podcast for the first time, so it's very exciting.

So Joel,
welcome to our lovely little show.

Hey thanks, Rachel.

So yeah, it's my maiden voyage on the HumanWare podcast.

I don't know how I made it to episode 16 without joining you, but I'm sure I'll be on again.

Probably much sooner than the next 16.

I am a HumanWare product specialist.

I work with all the blindness tools we have, the Braille things, the talking things.

And Rachel and I occasionally cross paths in our meandering around the country and the world.

And we always have a great time
when that happens.

But we're usually in different places, so look for us at an event near you. Yes.

Very, very true.

So Joel is here, and it is also my pleasure to introduce our second special guest today, our interviewee, Amos Miller, who is the founder and chief executive officer of Glidance.

Amos and his team connected with us, and we thought it would be a wonderful collaboration to have him on the podcast, so that we can all learn about Glide and where Amos got the inspiration for such a neat tool.

And Joel is here, as he has preordered the device and has experienced it through their demo days.

So it's going to be a great conversation, a little bit more about, you know, travel and guidance, and maybe inventing and technology.

So, Amos, welcome and thank you so much
for coming to the podcast.

We're all thrilled to have you here.

Thank you, Rachel. It's wonderful to be here.

And, Joel and David, yeah.

Excited to have a conversation about technology, about how we can use technology to, you know, make a meaningful difference in our lives, especially as it comes to independent mobility.

Amos, if you could give us a little bit of background first about you: Amos Miller, who are you, and what got you interested in technology?

And then that will, of course, lead into Glide and how the inspiration for that came around.

Yeah. Okay.

Very briefly, or as briefly as I can. So, yeah.

Amos Miller, the founder and CEO of Glidance.

I am also blind myself, just to level-set with everyone.

I lost my sight back in my early-to-mid 20s.

I have retinitis pigmentosa, a pretty aggressive version.

I lost my night vision in my teens, then tunnel vision, and then basically lost the rest of it while I was in college.

I think I switched from any kind of magnification to JAWS when I was 27 or 28, something like that.

I've also been a computer freak
from a very young age.

Started coding,
probably at the age of ten.

And, did a computer science degree.

And I've been in tech
for my entire career.

Before I started Glidance, I spent many years at Microsoft, in Microsoft Research and at Microsoft in the UK.

And while I was at Microsoft, that's really when I started to get interested in how we use technology to improve independent mobility.

And one of the pieces of technology that I developed while I was with Microsoft was Soundscape, which, David, we talked about before. In my mind, this has been a really important step in understanding the role of technology in our sense of independence when we're out and about.

And that's what led to Glidance.

Joel, I'm going to turn to you now and ask: what got you interested in Glide?

I know it made a huge splash when it first entered everybody's consciousness or awareness, but tell us how you came to know about Glide and what really captured your attention about it.

Yeah. Good question.

So I have a good friend.

His name is Mike Bullis.

He has also preordered Glide.

But he was a mentor to me for many years, probably since maybe 2010 or so.

And one night we were sitting around at a bar drinking some beers, and he says to me, you know, we've got to develop this tool. It doesn't need to be an artificial dog.

It has to be this thing where you kind of push it along; it doesn't need to steer itself, you need to steer it.

And all it really needs to do
is have brakes.

And one of these days, we're going to have enough technology, enough sensors and GPS and onboard intelligence, that it's going to be able to be useful.

And it'll just kind of apply the brakes one way or the other, and it'll steer you around.

And I said, man, that sounds great.

You know, this was like 2012.

And I said, yeah, let's, you know, we've got to build one.

He said, I don't think
we're there yet with the technology.

So I've always had this idea of a tool.

And I hate to steal your thunder, Amos, but, you know, these ideas have been percolating around.

And so when someone started telling me about this tool out there that a blind guy was making, doing kind of what I had been picturing for a while, I jumped right on it. I called up my buddy Mike and said, hey, someone's finally building this dream tool that we had come up with.

And it sounds like it's going to be great.

So as soon as I could, I wanted to get my hands on it.

And lo and behold, it was made by the same person who built an app that I had used and really liked, the Soundscape app, may it rest in peace.

Although it has some weird successors.

Yeah, yeah, yeah, yeah.

And more than one, even.

So it's kind of being fruitful and multiplying.

Of course, I now use VoiceVista all the time, which is kind of as close to a clone of Soundscape as you can get. I think it's very, very similar.

And in my experience, I think you can even open up the points of interest that you saved from Soundscape.

And it did a lot of the kinds of things that we think you need in an app.

You know, we have a GPS tool at HumanWare; it's the only hardware blindness GPS tool that I know of, the StellarTrek, the only one out there right now.

And it replicates a lot of those same features. It tries to bring it all together: what's going on as you're walking, your points of interest; it even describes the type of intersection you're coming up to. It does a lot of those features that we think are important in blindness navigation, in a hardware form instead of just as an app.

So that's one reason I liked Soundscape: it brought a lot of those features together in one place.

As for the inspiration for Glide, we talk a lot about Soundscape.

Did any of that influence how you developed, or are developing, the software portion of Glide?

That's a great question.

So Soundscape was developed with this notion of enhanced awareness, right?

The idea is that the more we are aware of where things are in our surroundings, and the more we can receive that information in a way that's not dominating our awareness, the more we can keep a conversation with somebody or pay attention to what's going on around us.

And so that's a lot of what the work on Soundscape was about: how do we expand or enhance our awareness so that we can make good navigation decisions in an environment, improve our memory of locations, our orientation in spaces?

But it's always been very limited to what we could get from data, what we could get from GPS localization, which is usually limited to outdoor surroundings for the most part, and inconsistent data sources.

So there have been shortcomings, but on the whole, I think we were able to demonstrate that that kind of capability really does improve confidence when you're out and about.

And one of the things I've always said with Soundscape is, I don't care about the micro-navigation; the micro-navigation, the user will do.

Yeah.

I'll just give them the surroundings, and they'll do the details.

Well, it turns out that for a lot of people, the details are the problem, right?

The last mile to a door, or the last few feet; what's standing in front of you in a space; trying to decide which way to go and navigate it on your own.

Yeah.

Like, that's kind of the micro-navigation, the small navigation details.

And I did explore whether we could do that with audio.

I did explore
whether we could do that with haptics.

I was involved in a project that we did at Microsoft with a controller, which was kind of a virtual cane idea that was attached to the body.

And I think all of these sort of started to scratch in that direction.

But what I felt at the end of the day is: is there a way to create an easier-to-use mobility aid, an actual primary mobility aid?

And my hypothesis was that we need something that is physically connected to the ground and physically guides.

So I think that we can do a lot
with information.

But for primary mobility, I think we need something that is physically connected to the ground and physically guides.

And interestingly, Joel, you had a similar intuition back in 2012, I think. And when I started to test that intuition, that's really what led to the idea of Glide.

I think we've seen some similar things.

There's the guiding-suitcase idea. I think a lot of us have walked with a suitcase and wished that the suitcase would guide us.

Right. Because it's so easy for it to guide us.

Right? Right.

And there have been projects in academia that tried that. So in many ways, Rachel, connecting back to your question, it's the gap that Soundscape left open that was the inspiration for Glide.

And that's where I think the opportunity actually is: to bring all of these together.

My wife always says, if I had Soundscape and Glide together, I would not be worried about going blind.

It sounds like you're trying to take the best of the information that Soundscape provided and the best that a physical guide can provide, and meld those together into a complete unit.

And that kind of leads me into another question: the hardware aspect of Glide.

Obviously, you and your team are working on the software and incorporating that, but where did the hardware design come from?

Yeah.

No, I mean, the hardware is still the same idea: two wheels with a long handle. You hold the handle a little bit like a guide dog's harness.

It's actually more in front of your leg rather than on the side.

And you walk with the device, and as you walk, the wheels steer the way.

So the foundation is still very much the same.

It's this concept of just follow the device, follow where it's going, and it'll guide you gently around an obstacle, keeping you on a safe path, with a lot more focus on maintaining a line of travel and maintaining clearance, like a dog will do, to make sure that you don't scrape your right shoulder.

And all of those capabilities are built into the device. In many ways, when I think about a device that can physically guide, that is physically connected to the ground, providing a separate point of reference at the ground, you know, what are the options, right?

It's either wheels or legs, or something along those lines.

Right.

I don't believe that legs, in terms of robots with legs, are ready for prime time right now.

I mean, one of the things that was very important for me in developing Glide was making it a viable product, right?

Something that is foldable, something that you can pick up and put in an Uber, something that you can walk onto a bus with and put in the overhead compartment or under the seat.

I just don't think that the legged solutions are there yet.

They will be here in a few years' time, and we can probably start to look at them as solutions then. But for now, we have the wheeled solution.

And interestingly, a lot of the legged solutions are slowly moving to wheeled solutions as well, because it's just so much simpler.

So that's where we are: still with a wheeled solution.

That does mean, Rachel, that it can guide you to steps; it can guide you to escalators.

It will put the brakes on. On escalators, you can walk right onto the escalator and just stand there.

If you are on steps, you can either wheel it down, or you can pick the device up, hold the rail, and walk down with it. It doesn't climb steps.

Yeah.

One question that I've always had about Glide, which I asked quite a long time ago, I guess because I travel all the time in Australia: how does Glide go with this?

Because one thing that I've always been bothered about is railway stations.

Railway stations petrify me, because about ten years ago I actually fell off a railway platform and broke two of my ribs, which is another story. But I think at some stage someone was talking about the cliff idea: that Glide will actually notify you of a drop-off, whether it be a curb or a railway platform, that sort of thing.

How's that development going so far?

So if I may, it doesn't just notify you, it bloody stops.

I mean, you're not going to walk off the edge of something unless you're not behind it.

If you're behind the Glide and you're walking with it, well, the first time, it brought me to a dead stop.

I thought it had hit something.

But it had just stopped.

It just wasn't going anywhere, the same way a dog would, really.

I mean, they stop, right?

You're not going to get them to go over that edge.

And that's my feedback from having used it.

It stops, or steers you away from the drop-off. Yes, that's right.

Right. David, drop-offs are obviously, in my mind, the highest-risk area.

I tell my team that, you know, walking into a wall hurts, but it's okay; nothing will happen. Correct.

I agree, yeah. But walking off the edge of a platform or a curb, that's a lot more problematic.

So we have a dedicated camera that's facing downwards.

It's a depth camera, and its primary responsibility is to detect drop-offs.

It actually has three cameras that work together to create a stereo image in different lighting conditions.

So it's a very sophisticated camera.

And that's what enables us to see a drop-off from quite a few meters ahead, and also to the sides of the device. There's a concept in robotics called lethal obstacles, which is not as grim as it sounds; it's basically the highest order of obstacle.

The robot will do everything to keep you away from them.

It won't compromise on them.

Drop-offs are basically marked as lethal obstacles.

And basically the experience is that if you're walking and the platform edge is on your left or on your right, it will keep you well away from the platform.

If it's weaving you around an obstacle, it will not choose the side that the platform is on unless there is plenty of clearance.

Well, plenty of clearance.

And, like Joel said, if you're walking towards the platform edge and it can't steer you onto a safe path, it will just put the brakes on.
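
For readers curious what the "lethal obstacle" idea described above might look like in code, here is a minimal illustrative sketch. It is not Glidance's implementation: the grid, the depth-threshold rule, and the braking check are invented for illustration, though the value 255 for "lethal" follows the common ROS costmap convention.

```python
# Illustrative sketch of lethal obstacles in a costmap (not Glidance's code).
# A drop-off is marked lethal; the planner must never cross a lethal cell,
# and the device brakes if the next cell along its heading is lethal.

LETHAL = 255   # ROS costmap convention: cost the planner will never cross
FREE = 0

def mark_drop_offs(costmap, depth_readings, floor_z, threshold=0.15):
    """Mark cells whose measured ground height falls well below the floor
    plane (e.g. a platform edge or curb) as lethal obstacles."""
    for (row, col), z in depth_readings.items():
        if floor_z - z > threshold:      # ground drops away sharply
            costmap[row][col] = LETHAL
    return costmap

def safe_to_advance(costmap, row, col, heading):
    """Brake (return False) if the next cell along the heading is lethal
    or outside the known map."""
    dr, dc = heading
    nr, nc = row + dr, col + dc
    if not (0 <= nr < len(costmap) and 0 <= nc < len(costmap[0])):
        return False                     # unknown space: stop
    return costmap[nr][nc] != LETHAL

# A 3x3 map with a drop-off detected one cell ahead of position (1, 1):
grid = [[FREE] * 3 for _ in range(3)]
grid = mark_drop_offs(grid, {(0, 1): -0.30}, floor_z=0.0)
print(safe_to_advance(grid, 1, 1, heading=(-1, 0)))  # brakes: False
```

The point of the sketch is the asymmetry Amos describes: a lethal cell is never traded off against path length, so the only options are steering around it or stopping.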

Is it giving you any feedback about the environment? I don't know, haptic vibration, audio feedback, that sort of stuff?

The main feedback is the steering
and the braking.

Right.

And what you feel through the handle.

But it's quite interesting how much you can use those pieces of feedback alone.

So for example, if you're walking along a corridor, you can angle the device a little bit towards one of the walls, and then you can almost feel as though you're shorelining that wall from a meter away.

Right.

So you can get information
about your surroundings.

You can also feel the changes of
texture through the handle.

But we do also have the cameras, which are pretty sophisticated, and the AI that runs on the device and in the cloud gives us a lot of capability to provide additional information.

At the moment, it can provide you information about the obstacles or the objects that it can see.

We think about objects as two types.

One is what we call line-of-sight targets: objects in the environment that you would want to walk towards, like escalators, steps, elevators, counters, dropped curbs, up curbs, all of those things.

I often describe them as an obstacle that is actually a path.

Because you choose the elevator as a destination.

If you didn't choose it as a destination, it would avoid it, right?

It would keep you away from it.

It would just think of it as a wall. And the second type is landmarks. What we are working on, which will not surprise you, is some of the more open-ended information about an environment, like saying, hey Glide, what's going on here, and getting a bit of a description, like what you would get from the Meta Ray-Bans and things of that nature.

So the interesting thing is that once we have the guidance platform and the cameras in place, we can now start to use those more open-ended AI capabilities and relate them to the guidance.

Is there any way that you can get Glide to, I don't know, make sure that you're stepping into the carriage and not between two cars, for example, when you're getting onto a train? That's the type of scenario.

So first of all, I do think that it's useful to have a cane in the other hand sometimes, because Glide doesn't care about the cane.

With the positioning of the cameras and everything, you can poke around and check things out, and it doesn't get scared or anything.

We are actually planning to add an accessory so that you can have a telescopic cane attached to Glide's stem, so that you can pull it out when you do want to probe around.

I think that would be very useful in exactly those kinds of situations you just described.

How do you compensate for the fact that the AI system may be detecting something that's actually not really there? How is that being handled as the product is developed?

It's a good question.

So we have two sets of models. One set is more open-ended, like visual language models, which I would say are more prone to hallucination.

But those are used for more open requests, like, find me the reception desk.

And it took you to the bar instead. Dear me.

Then we have the models that we trained, which run on the device, the very specialized models. For example, the main model that we use is for segmentation, which determines what's a walkable space: if you are on a sidewalk and there's grass on the side, you don't want to go onto the grass.

Yeah.

Like even though they're both flat. Yeah.

And that model we train with our own data.

We can test its performance. We know what it is.

We also know when it's uncertain; it doesn't hallucinate so much.

It provides probabilities about its accuracy.
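
As a rough illustration of what "providing probabilities about its accuracy" can mean for a classifier like the segmentation model described here, below is a hedged sketch: the class names, logit values, and confidence threshold are all made up, and real segmentation runs per pixel over an image rather than on a single score vector.

```python
# Illustrative sketch (not Glidance's model): turn raw class scores into
# probabilities with softmax, and report "uncertain" when the top
# probability is too low to trust.
import math

CLASSES = ["sidewalk", "grass", "road"]   # hypothetical labels

def softmax(logits):
    """Convert raw scores into probabilities that sum to 1."""
    exps = [math.exp(x - max(logits)) for x in logits]  # shift for stability
    total = sum(exps)
    return [e / total for e in exps]

def classify(logits, min_confidence=0.7):
    """Return (label, confidence), or ("uncertain", p) when the model's
    top probability is below the trust threshold."""
    probs = softmax(logits)
    p = max(probs)
    label = CLASSES[probs.index(p)]
    if p < min_confidence:
        return "uncertain", p
    return label, p

print(classify([4.0, 0.5, 0.2]))   # one class clearly dominates
print(classify([1.0, 0.9, 0.8]))   # scores too close: reported uncertain
```

This is the property Amos contrasts with generative models: a calibrated classifier can say "I am not confident" instead of inventing an answer.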

Okay. Yep. Yep.

Yeah.

So it doesn't have the same kind of hallucination issues that the generative models have.

So we can tell you, hey, at this stage I am not confident in the information I'm giving you.

Yeah.

So at least you know not to trust it.

Okay.

We also combine multiple models at the same time, to compare and see if we get the same results from both.

That increases our confidence in what we are getting.
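
The cross-checking idea just described can be sketched very simply. This is an illustrative toy, not Glidance's pipeline: the two model outputs, labels, and the averaging rule are invented to show how requiring agreement between independent models filters out detections only one model produced.

```python
# Illustrative sketch (not Glidance's code): keep a detection only when
# two independent models both report it, averaging their confidences.

def agreement(pred_a, pred_b):
    """Intersect two models' label->confidence maps; a label survives
    only if both models produced it."""
    agreed = {}
    for label, conf_a in pred_a.items():
        if label in pred_b:
            agreed[label] = (conf_a + pred_b[label]) / 2
    return agreed

# Two hypothetical models looking at the same frame:
model_a = {"drop_off": 0.92, "pedestrian": 0.55}
model_b = {"drop_off": 0.88}

# Both saw the drop-off; the lone "pedestrian" call is discarded.
print(agreement(model_a, model_b))
```

Real systems weigh this differently for safety-critical classes (a drop-off seen by only one model might still trigger braking), but the agreement test is the confidence-raising mechanism Amos refers to.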

We also have motors in the wheels for braking.

So we use the motors for brakes and speed control.

And we can also give a little bit of a pull forward, but only as power assist, I would call it, rather than dragging you around, which is something that we avoid.

People have had different views about that. I don't know if Joel or Rachel want to mention it, but I think on the whole, people actually appreciate the fact that they are always in full control of the speed and of what the device is doing.

So I personally love that; I like that I am kind of pushing it.

I know a few people who have said it's hard to get used to, because they're used to that dynamic pull that you establish with a guide dog, and it's really hard to get used to the idea that you are pushing something and getting that feedback, rather than countering the pull.

So it's a different kinesthetic model, if you will, of what you're doing, which again is altogether different from what you're doing with the cane.

That's a whole different kettle of fish.

And Amos, one question I had is about the cameras and the AI.

So will this device be internet-dependent? Will someone need a data plan to access it? How are you interfacing with the models when somebody is out and about?

So we do have a cellular modem on the device.

So it comes with a data plan; it's not going to be using your data plan.

Basically, part of the subscription is paying for the data.

Yeah.

And the device is not dependent on the cloud for everything.

It works offline for its core guidance capabilities, using the onboard models for detection of objects and things; it keeps functioning. But for the more advanced AI capabilities, and pulling in maps and so on, it will connect using its own data plan.

Okay.

And then if I wanted to download an offline map, or maybe I wanted to take it hiking on a trail, could I do that and still have maybe the basic AI capabilities available, or is it best used with access to the cellular network?

Great question.

So the honest answer is we haven't developed that yet, but we have all the capabilities to enable it.

So, Amos, as we've said, it's basically a telescopic handle on wheels.

But what does it actually feel like? If I came across Glide and had to explain it to somebody, I mean, yes, I can say it's a handle and it's got two wheels, but what is it physically like?

What does the device feel like?

And, you know, how big are the wheels, how big is the main chassis, all that sort of stuff?

So the wheels have gotten bigger.

In my experience.

The wheels have gotten bigger.

But the feedback I often get is that people are surprised that it's smaller than they thought it was going to be.
Right? Right.

Yeah. It's not small.

It's not a tiny device.

The wheels are currently 7.5 inches in diameter, about eight inches apart.

Right? Yeah.

So it's not tiny, but on the whole, it's not a hefty thing.

It has two wheels, and the stem is about an inch in diameter.

So, you know, it's a bit like a scooter, if you like.

It feels a bit like holding the front of a scooter.

Yeah. With the wheel.

And then the handle is a horizontal grip.

Like a guide dog harness.

And it has buttons in the handle.

We've actually repositioned the buttons so that you use your thumbs for left and right, and you use a sort of trigger grip for confirmation.

Yep. And as you walk with it, you kind of feel where the device is pulling you. Or not pulling you, exactly, but gently guiding you, steering you.

You feel the handle turning in your hand.

You feel the terrain through the handle.

And, you know, if it's on an incline or a decline, you can just feel the pressure in the handle.

It's very tactile.

I don't think that Glide needs to be the be-all and end-all.

I think it's a really good idea
and it should connect and work with other tools
that we use in our daily life.

I'm thinking maybe Google Maps would be a good interface.

Or, you know, it could connect with Uber.

You can think of all kinds of interfaces, like smart glasses. Again, I really think of Glide and mobility aids as an interface, as a device that gives the AI or other tools a way to communicate with the user in a way that I think is going to be very intuitive.

So, Amos, does it connect to apps running on your smartphone?

I imagine the scenario where you've just booked an Uber, and then Glide goes, you know what? Your Uber just pulled up; I'm going to actually take you to where the car is parked.

Yeah, I mean, it's entirely possible.

I mean, we do connect to the smartphone.

We have a companion app on the phone that is used purely for management of the device.

Right.

But we can work on interfaces to enable those kinds of integrations.

Absolutely. Where can folks find you?

I know that you have been very involved with the Glide community, whether someone's a founder or whether they've, you know, preordered.

Yeah.

Well, I love that you're so communicative, and I'm sure your users do as well.

So where can people find you if they want to learn more or follow what Glide is up to? Yeah. Thank you.

The best place really is glidance.io.

So, our website, glidance.io.

If you just register with first name, last name, and email address, you'll then be invited to our various engagement opportunities, which include monthly Zoom calls, which are really the highlight for the whole team.

The whole team goes on these monthly Zoom calls, and they're always very rich with engagement and information.

And now that people are starting to use the device, we're getting real-life experiences shared, and so on.

We also do demo days and conferences.

The upcoming conferences are the NFB conference and the ACB conference in July.

And from that time frame, we will start to do demo days in other parts of the country and the world. Step by step, we will get to Australia, I promise, David.

Oh, thank you. I'll be hanging in there for it.

Yeah.

No, I'd love to come and at least do a few demo days in Australia.

Regardless, I think it would be amazing.

Yeah.

So those really are the ways, Rachel, for folks to engage.

And yeah, it's always been very important to me to develop Glide out in the open, warts and all, I would say.

It does mean that people saw versions of the device that were still in development and had some problems. A lot of those are being cleaned up, and I'm excited about where we are. And I've been enjoying doing it with everyone.

I learned so much.

I think we're learning together so much.

So many of these questions that we have to deal with don't have clear, obvious answers.

Yeah.

We are discovering together, and that's been an incredible journey.

Amos, thank you so much for coming on.

We really appreciate you and the Glidance team and the updates you are able to provide.

And I'm really, really excited to see
where glide is going to go in the future.

Hopefully I can sneak in a demo at the summer shows; it'd be great to try it and see where it's been and where it's going.

And I know you've got us all very, very thrilled for what's coming in the future.

As Amos says, onwards and upwards.

Exactly.

Hey, everybody, and welcome to the Hot Topic segment for episode 16.

We had a great discussion with Amos Miller about Glide and travel and orientation and mobility, and that made us start thinking about what would be a good thing to discuss that folks may have different opinions on.

And this time we want to talk about wearables, specifically which wearables we would choose to use, whether it's an Apple Watch or the Ray-Ban Meta glasses.

Maybe it's another external device that you enjoy taking along with you.

But I'll jump in and start us off by saying that what I've found really helps my travel is the Ray-Ban Metas.

They are coming along in terms of applications being integrated with them.

So I know that Aira is testing integration, and I can just see that being really useful.

I was able to jump into the beta of that, and just walking through an airport, I didn't have to try to hold my phone up while carrying another bag.

I could just point my head at wherever
I needed them to look.

And I found that very, very simple.

So the Ray-Ban Metas, I find, are a really great kind of travel companion.

The discussion of the Apple Watch came up, and I'm going to turn to Joel now and say, Joel, you have some very interesting thoughts to share with us all about the Apple Watch. Walk us through it.

Yeah, yeah.

So I have to admit that, although it's been around for ten years or so now, I do not use an Apple Watch.

Someone gave me one a few years ago.

It's a Series 3.

And I tried it out, and I wasn't enthralled by it.

And I think I have a number of reasons why. Do you want my list?

So this is the hot topic.

So here's my list. As a blind guy, if I'm using my phone, I hold it in one hand, and then I unlock it and I swipe and I tap.
And I was thinking that wouldn't be the case with the watch.

But when I'm using my Apple Watch, both my hands still get tied up, because it's on my wrist.

So I have to kind of hold that hand still, and then I have to operate it with my other hand.

So it's not really buying me the advantage that I think sighted people get with the Apple Watch, where they can kind of glance at their notifications and get that information from their wrist.

So it doesn't seem to increase convenience for me.

Number two is that the thing is huge.

It stands away from my arm
and I keep hitting it on everything.

And I've always been a watch wearer.

You know, a Braille watch.

But watches have a much lower profile than this big square box that you wear on your arm.
It's not at all sleek and ergonomic.

It's this big box-looking thing, which some people are into, I guess, but to me, it's just big.

And then, as I would expect with a watch like that, I have to charge it all the time.

Now, some people don't wear it
to bed, but, you know, I kind of thought, oh, well, you know,
the sleep tracking is kind of nice.

But then when I do wear it to bed, sometimes in the middle of the night it'll just start yelling and wake me up.

So I have to turn off VoiceOver to go to sleep, which defeats the purpose of having a wearable.

So, I don't know, at every turn I was just constantly being frustrated by using the Apple Watch.

I couldn't find an elegant, smooth way to fit it into my life
that really improved anything.

That's a very comprehensive list, and there are a lot of really good points that you make. Honestly, the time I wear my Apple Watch the most is when I'm doing a workout.

It's because it has little haptics
and it'll say, hey, you know, you've passed
this much in your workout.

And so I find it useful for that.

It just kind of keeps me on track.

Can you tell me more about that?

What does it do? Like,
does it count your reps?

It depends on what you're using it with.

I use it a lot with Apple Fitness Plus,
and you can quickly kind of glance at, you know, your heart rate
or how many steps you've done or the distance that you've run,

if you're doing any running workout.

And can you do that without stopping?

I am not that good.

Okay, okay. Because it is, you know.

You're
kind of jerking your arm a little bit, so, you know, there's some,
I guess, coordination to do that.

But that's kind of what I find it
useful for.

But other than that,
I don't wear it too much myself.

David has a whole different take on this.

All right. So, a couple of things.

So I use my Apple Watch all the time.

I actually do wear it to bed.

I've got Do Not Disturb on
so it doesn't harass me in the middle of the night at all,
but I use it to, you know, check all my sleep
stats, my breathing, whether I've been restless through the night, moving around,
all that sort of stuff.

So that's really cool. And
because the watch charges very fast,

at nighttime, you know, I sit down,
have dinner and blah, blah, blah.

So I just,
I just get a little bit of downtime, put my watch on charge,
leave it for an hour or so.

By the time I get back into the room,
pick it up.

It's actually fully charged again.

So that that never worries me, because
I've also got to charge my phone as well.

And I've got one of those stands
where, you know, I put my AirPods on
one little charging bit, my Apple Watch on the other bit,
and my phone on the other bit.

So it's just a bit of a routine.

That's how you charge
all five of your Apple Watches.

That's right.

So I can tell you, I had to buy a wall charger.

Well, I didn't have to buy one, but I did.

That's fine.

Oh well, look, you know,
they're always talking about wall chargers and clients complaining
about Apple Watch charging at the same

house.

Right.

Yeah. Exactly.

So my wearable of choice.

You know, Rachel,
the Meta glasses are cool.

And I have been enjoying mine a lot.

And they've gotten dramatically better
in the last several months.

The descriptions, the the AI, all that.

But my wearable of choice is the AirPods.

I use my AirPods all the time.

You know, sometimes, sometimes I'll ask Siri
to read me my messages or my recent email, or I'll reply and you can nod your head
and it'll read you something.

You can have it in that transparency mode
where you can hear what's going on around you,
or if you want to drown things out, you can just kind of turn on the noise
canceling, like on an airplane or whatever.

I use my AirPods, and I was
late to the party with those too.

By the way, I didn't get AirPods
until the AirPods Pro 2 came out.

They were the first ones I used,

and I figured I'd give them a few years
to work out the kinks.

And that's my position
with the Apple Watch.

You know, it's been around
ten years. Maybe I should give it a try.

You know,

I, I never thought I'd be late to the game
with technology, but I'm getting to be that way with these things
that are going to be attached to my body.

Yeah, I got the iPhone
when it first came out, but

I've been slow to come
to the wearable world for a bit.

Yeah.

I mean, besides the Apple Watch
and the Ray-Bans and the AirPods,
which of course I have, all of them.

My other favorite one, and I don't know if
this one's well known in the States, because every time I bring it up, people
go, what are you talking about?

It's called the Miniguide.

So it's a little handheld device
that vibrates.

It's, you know, it's a sonar-type system,
so it sends out pulses that come back, and depending on how close the object
is, the motor in the unit vibrates. You can switch the beam the Miniguide sends out
between 2.5 meters up
to four meters away, which, thinking in US terms, is round about 8 to 13 feet.

And that's more
for scanning your environment.

Like if you're in a park
and you want to find out where the swings are for your children, for example.

So that works out really nicely.

And you can, you know,
I can plug in a pair of headphones and listen to audio tones
rather than the vibrating, but that's just one little device
that's normally in my pocket.

If I want to double-check a wall and
find out where there's a gap in the wall,

I just, you know,
hold it ahead and point it at the wall,

and when it stops vibrating,
that's the gap.

So that's the other one
that's always in my toolkit.

It's the Miniguide.

Well, this was a great discussion
on wearables and why we choose them.

And I'm inspired.

I might have to take a look at the new Apple
Watch along with Joel in the fall. Yeah.

And if we do this, if we do this
next year, there will be more wearables to talk about,
because this, you know, certainly will be the year of the
wearable.

It will be. Yes.

Hey everybody and welcome to Partner Corner for episode 16.

It is my pleasure to be with Matt Ater
from Vispero.

He is
the Senior Vice President at Vispero.

And he is joining us today
to discuss the partnership that we have developed at HumanWare with Vispero to have JAWS on the BrailleNote.

And before we have Matt
come and say a few words, this partnership is very exciting
for us here at HumanWare. It aligns with our mission
to empower blind and visually impaired individuals to live life on their terms
and to be independent.

And we know that a lot of independence
begins with computers
and developing those computer skills and then transferring
to use of a screen reader such as JAWS.

And what we are able
to offer with JAWS for BrailleNote

is a six-month complimentary license with each BrailleNote Evolve, and then the possibility to purchase
either a Home or Pro license from HumanWare
or from our local distributors.

And this is JAWS for BrailleNote,
and it will be limited to just one device,
that being your BrailleNote.

And you can buy that at a price
that is yet to be determined, but will be discounted
from the regular JAWS version.

Though it is
not a different JAWS version at all.

It will be the same as the regular JAWS that you can get for a Windows computer.

So with all of that said, Matt,
welcome to the podcast.

We are just thrilled
that you're here with us.

Thanks, Rachel.

Really looking forward to our discussion
and thrilled about the opportunity for, you know, BrailleNote
Evolve customers to be able to have access to JAWS on the BrailleNote,
which I think is, as you were saying, key to people as they grow through the use of technology.
And the access to a PC is kind of the real benefit here, because, you know, as a person who's used the BrailleNote in the past and also worked for HumanWare
years ago, I recognize the importance of, you know, early entry into technology.
You know, it sometimes begins with devices like the BrailleNote,
and being able to collaborate with HumanWare
means that as people evolve in their technology needs and move on
to using more advanced tech, they get quick access to,
you know, JAWS, through the platform.

So I'm really excited about it. As are we.

Tell us a little bit more, Matt,
about how this partnership came to be.

You gave us a little bit of your history, you know, working at HumanWare and having
used a BrailleNote in the past.

But what prompted the discussions?

I mean, we know that JAWS runs on
Windows and the BrailleNote is Windows.

Was that

just kind of a match too good
to pass up, or what?

What were your thoughts,
as you began these discussions?

Yeah,
I think it benefits both companies. Right.

So, from our perspective, you know, people who are going through school, we want to make sure
that they have access to our software.

You know, currently, you know,
we do that in different ways.

You know, it may be sold direct
into schools, it may be sold through, you know, other partnerships
like American Printing House.

But the, you know, the
the key is that the software also needed to be able to be ready for kids
using different kinds of technology.

And then we look at,
you know, vocational rehabilitation.

And they're buying BrailleNotes
to get people, you know, skills, and maybe they feel more comfortable on a

BrailleNote device
like the BrailleNote Evolve.

And at times they may start
moving into using computing.

And if we sat back and said,
well, this just doesn't work,

then, you know, we have a gap here.

So it made sense for us to sit down.

You know, I met with the team
both internally, and we even took a visit to
the HumanWare offices outside of London, and sat down and kind of worked out
how we could partner. You know, in today's world, just
because we may make competitive products, it doesn't mean we can't work together
on certain solutions.

And I think that's where we,
you know, found, the opportunity.

I love what you said in today's world
that, you know, we have competitive products, but it doesn't mean
we can't work on various solutions.

We found that with the Monarch and JAWS,
and it was just wonderful to partner with you folks at Vispero to produce the webinar
and show how the Monarch can work with JAWS.

And I think same with the Braille note.

Whether, you know, someone has been a
JAWS user a long time, like you or I, or maybe they're more comfortable
with NVDA, the fact is that

JAWS remains a very customizable
and powerful screen reader, and having that option available on the BrailleNote
puts it more in reach,

I think, for so many.

So it's definitely a benefit to,
both of us and both of our missions.

Tell me a little bit
more about your role at Vispero.

I mean, I know you mentioned you were able to be
with the team in London, but what is it,

maybe, that brought you to London? And tell
us a little bit more about what you do and how you were able to
kind of have a hand in this partnership.

So I've been with the
company about 12 years and, you know, my current role
is, you know, called business development.

And basically, I work across
all the segments of our business to kind of figure out
how we help: how do we work on partnerships, how do we work with our clients,
how do we resolve big issues, how do we come out with solutions
that work for the clients?

And so think of it as somebody who's,
you know, on the street, talking to the customers
and talking to the clients and developing, you know, solutions that work
well for everybody.

So it's very exciting for me, just because, you know, my favorite thing is actually working
with customers and clients and partners.

And so, you know, it could be anything from
the work we do in self-service and kiosks
and things like that.

Or it could be, you know, the work we do with our customers
who buy ZoomText or Fusion.

And so,
you know, an example was years ago. We first met, you and I, during
the fun pandemic times.

And, you know, one of the
solutions that, you know,

I found that we had to develop at
some point was the split audio feature within JAWS.

And, you know, it was so key
during the pandemic times, when we're on social media, but we're also in platforms
like Zoom and Teams and whatever other 50 versions of meeting software we were on.

And, you know, I'm sitting back going, we have to come up with a solution
that solves this problem.

And, and that's
just by being in the community.

And so that's the fun part about this job
that I'm in: it's very community focused. It's very customer focused.

And, you know, basically I
listen to the customers and bring back, you know, solutions that the customers may need.

The fact that you're able to be a partner with so many agencies
and organizations,

I'm very happy that you're able to partner
with us and put JAWS on the Evolve. And that doesn't just mean, you know, the JAWS
version, right?

I know maybe you don't do
scripting yourself, but that also could involve
scripting, right?

So the screen reader has no different functionality
from working with it on a Windows PC.

Would you say that's accurate?

Yeah.

The only difference is it's
licensed to the piece of hardware.

So it's a very specific version.

It's a single,
single install for that device.

It doesn't stop somebody from bringing
an existing copy of

JAWS to the device if they wanted to,
if they already have a license.

You know, but it's designed so that, hey, from a licensing perspective, it's
specific to the BrailleNote Evolve,
but the features within it are all the same features.

So if somebody wanted to script and make it work with a specific application, they can do that. All of the features of JAWS,
whether it's the split audio or even third-party
scripts, could still run on the device.

So if somebody wants to use scripts
that are made by somebody else, they can deploy those to the machine,
just like anybody else can.

The, you know, AI features will exist.

So if you're using Picture Smart AI, if you're using the new Page
Explorer in JAWS, or
if you hook a camera up to the BrailleNote and use Face in View,
then it can do that as well.

So it can use
any of those components or features.

And if you haven't tried the
Vispero Companion, which, you know, recently got rebranded from FS Companion
to the Vispero Companion, it's where you can get all your answers

to any question on how to use JAWS.

You know, with

Windows, Office, Google Docs, whatever.

You just, you know, basically go in.

And one of the cool things I love about
the BrailleNote Evolve, compared to previous, you know, Braille devices, is having some of the keys that are on it. It reduces the amount of combo keys and sequencing keys that we used to do,
or still do, on Braille displays today.

And I'll give you guys a lot of credit
on that, because that's like a massive change, right?

To have an Alt
key, a Control key, an Insert key, you know,
all of these keys, a Function key.

You know, there's some sequencing
when it comes to the Function key, you know, hitting the Function key. Yes.

And you can explain that
better than I can.

But the fact that I could do, you know, Alt-Control-S or whatever the keystroke may be,
is really powerful.

You mentioned the
Braille display piece.

When you're connecting,
you know, a traditional Braille display from any vendor
to a computer to use it with JAWS, there are some commands, selecting
text comes to mind, that can be a bit of sequencing
and a bit of a challenge.

So yeah, the fact that we do have Control
and Shift and arrows just makes navigating and working
in Office with JAWS, or on any application like a web app,

So much easier.

I find that to be the case as well.

And lastly,
I will say that there is no need to, you know, turn on JAWS
and then search for a Braille display,

because the BrailleNote Evolve
is a fully integrated device.

You turn on JAWS
and your Braille is immediately there.

And then you can go into Settings Center and configure
how you would like it to display.

But that also takes a huge piece out of getting all set up and running with JAWS and Braille, if you want
that kind of full suite of accessibility.

And I think that's very helpful as well.
You know, and I'll share with people, because, you know, Settings Center may be intimidating for some people,
because there are a lot of different settings in there.

And you can go through and pick Braille
and stuff like that, and different braille settings.

But if you just run the Startup
Wizard from the Help menu, so it's Alt+H
and then Z for Startup Wizard.

The first 4
or 5 screens are things about, you know, setting your voice
rate and keyboard echo and, you know, how you want to use forms
and other things like that.

But the last two screens are Braille
related.

So you can go through and choose
how you want it: do you want Grade 2,

do you want

UEB, do you want input and output to be the same.

You want Braille flash messages
to stay up long or short or whatever.

All of those, you know, the bulk of the Braille settings
are sitting in the last two screens of the Startup Wizard.

And so it's probably faster
to go through that experience, if you're not as comfortable
with Settings Center.

And so I'd highly recommend people
first try that, to see if it covers all the settings you're looking for,
before you start tackling

a little bit more complex environment.

I love that as you've developed JAWS,
you know, you've made it more approachable no matter where
someone is coming to it from, whether they are new to JAWS
or whether they're an experienced user.

Like it has features
that can help everyone.

And if you're a teacher of the visually impaired or
an instructor out there, using the
Vispero Companion is really easy, because you can do it from within JAWS:
Insert+Space and then F1.

But if you're not comfortable with that,
or maybe you're on a device that doesn't have JAWS, like your iPhone,
and you want to look something up, you can just bookmark
it on your iPhone.

And that way you can easily
search questions from another device.

It's a regular website.

You just go there and ask it questions,
and away you go.

We so appreciate Matt coming on today
and telling us a little bit more about the partnership
and some of the assistance that is available in JAWS to get up
and running as quickly as possible.

So, Matt, thank you for coming on,
and thank you so much for what your team is doing to help our team
as we launch the BrailleNote Evolve and work as efficiently
and effectively as we all can with it.

So thank you so much.

Thank you, Rachel, and thank you to HumanWare
for bringing out a Windows device.

We're very excited about it.

Hey, everybody,
it's time for a special message.

HumanWare is proud to announce
the launch of the 2026 edition of the Jim Halliday
STEAM Innovation Bursary, a global program recognizing individuals
who combine skill, empathy, and purpose.

People who create, design or advocate
for meaningful change.

The bursary seeks students
who ask bold questions such as how could this be made better
for everyone?

It celebrates projects that merge
disciplines and challenge conventions.

Projects can take many forms,
visual, tactile, auditory, or conceptual, as long as they embody
curiosity, compassion, and the drive to make accessibility
part of everyday innovation.

Eligibility is open to students ages
13 to 25 who have developed
or are developing steam related projects that improve accessibility
or benefit their communities.

Applicants may apply directly
or be nominated by a teacher, mentor, or community leader.

The program is global
with a focus on accessibility, inclusion, and assistive technologies.

Blind and low vision
students are strongly encouraged to apply. An Evaluation
Committee will review applications based on innovation, impact, alignment
with HumanWare's values, communication quality,
and community engagement.

Now if you are chosen,
what exciting things will you win?

Recipients will receive
a selection of HumanWare's cutting-edge

Braille and low vision solutions.

These include a Brailliant BI 40X, a Mantis Q40, and a Prodigi Complete kit
with a document camera and Surface tablet.

The recipients' stories will be shared
across HumanWare's platforms
to inspire others to see technology not only as a tool,
but as a force for equality and inclusion.

So keep an eye out on the social media
channels and the HumanWare news pages

for more information about this program.

Hey everybody,
it's time for the listener line.

We have a listener who has written in
and wanted to share a story, and we are all too happy to do so.

So Deborah Armstrong wrote to us,
and she works as an alternate media
specialist at a community college.

She has a Brailliant at work
and a Monarch at home,

through the Rise program offered by APH.

Both devices assist her in proofreading
classroom handouts as she scans and OCRs them,
converting them to text for her students.

She says, one day
I was distressed to discover that I thought some of the dots on
the Brailliant were sticking, and I had been ever so careful to always
touch the device with clean hands.

The next day, when I wasn't so tired,

I realized the dots weren't
sticking at all.

I'd inadvertently toggled
on the Spanish profile, and the Spanish letters of the alphabet
often have extra dots in them.

That's why the Braille looked so weird.

She then gives us a keystroke.

Enter plus L, or C4, the Command key 4, on the Brailliant toggles
between language profiles.

She says so
if you set one up for another language, try that keystroke
before you think dots are sticking.

Another story she shares with us.

She says, I had a similar problem
once in windows.

I was trying to install
one of the Microsoft Spanish Voices so I could read some Spanish material
with NVDA.

Unfortunately,
I changed the display language, so everything shown on screen by Windows
suddenly appeared in Spanish.

I was able to switch my display language
back to English and install

Spanish voices, but it required
reading the settings dialogs carefully.

It's very important,
if you mess something up, to stop, think, read carefully,
and if you're stressed, walk away from the technology for a day, until you can think about the problem
logically.

Deborah, that's wonderful advice.
Sometimes, if we're in some technology trials
and we just take a small break, it'll help us
think about the problem differently, and we can come back and triumph
over those technological gremlins that sometimes like to come out to play.

If you would like to be like Deborah and
write in, we would love to hear from you.

You can do so at [email protected].

That's [email protected].

And yes we do read them
and very much enjoy hearing from you.

Hey everybody, it's time for the upcoming shows segment.

You can find us in the following
locations.

April 15th through 17th.

We will be at the Pennsylvania-Delaware
AER, the Association for Education and Rehabilitation
of the Blind and Visually Impaired, conference in Harrisburg,
Pennsylvania. From April 16th to 18th,

we will be at the California Transcribers
and Educators of the Blind and Visually

Impaired, or CTEBVI, conference in Los Angeles.

April 16th through 18th as well,
we will be joining Sensotec at ZieZo in
the Netherlands, and that is in Utrecht.

Finally, April 23rd through 26th,
we will be at the NFB state convention in Utah
and that is located in Ogden.

So if you happen to be in any of those
places, do stop by and say hello.

If you have comments or suggestions,
we'd love to hear from you.

Please send them to [email protected].

That's [email protected].

Thanks so much for listening to See Things
Differently.

We'll see you next month.

Welcome to a "Sweet 16" episode of "See Things Differently with HumanWare"! This month, hosts Rachel Ramos and David Woodbridge are joined by HumanWare Product Specialist Joel Zimba to dive deep into the future of independent mobility and the expansion of the BrailleNote ecosystem.

Segment 1: Glidance and the Evolution of Mobility

Amos Miller, Founder and CEO of Glidance (https://glidance.io) joins the show to discuss Glide, a groundbreaking self-steering mobility aid.

  • Inspiration: Amos shares how his work on Microsoft Soundscape led to the realization that many users need "micro-navigation" and physical guidance for the "last mile" to a door or around obstacles
  • The Hardware: Glide is a wheeled device with a handle similar to a guide dog harness. It features a sophisticated downward-facing depth camera system to detect "lethal obstacles" like drop-offs and platform edges, bringing the user to a dead stop if necessary
  • Tech Specs: The device includes a cellular modem with its own data plan, ensuring it can access advanced AI and mapping features while maintaining core guidance capabilities offline

Segment 2: The Wearables Debate

The team holds a "Hot Topic" discussion on the wearables they actually use—and the ones they don't.

  • Ray-Ban Meta: Rachel discusses how the glasses assist her during travel, especially when integrated with services like Aira
  • Apple Watch Critique: Joel provides a candid list of why the Apple Watch hasn't "won him over," citing its bulkiness and the fact that it still requires two hands for a blind user to operate
  • AirPods & Miniguide: David shares his love for AirPods as his primary wearable and highlights the Miniguide, a handheld sonar device for environmental scanning

Segment 3: Partner Corner – Vispero & JAWS

Matt Ater, Senior VP at Vispero, joins Rachel to announce a major partnership: bringing JAWS to the BrailleNote Evolve.

  • Fully Integrated: This is the same powerful JAWS used on Windows PCs, now licensed directly to the BrailleNote Evolve
  • PC Power: Because the Evolve features dedicated Alt, Control, and Insert keys, navigating complex Windows applications and using JAWS scripts is more intuitive than on a traditional Braille display
  • Complimentary Access: New Evolve customers will receive a six-month complimentary license to get started

The Jim Halliday STEAM Innovation Bursary 2026

HumanWare is proud to announce that the Jim Halliday STEAM Innovation Bursary is now open for 2026! This global program recognizes students aged 13 to 25 who are developing STEAM-related projects that improve accessibility or benefit their communities.

  • The Goal: We are looking for bold questions and projects that merge disciplines to make accessibility part of everyday innovation.
  • Recipients will receive cutting-edge technology, including a Brailliant BI 40X, a Mantis Q40, and a Prodigi for Windows Complete Kit.
  • How to Apply: Students can apply directly or be nominated by a mentor. Visit the HumanWare News Page for full details

Upcoming Shows & Events

Catch the HumanWare team at these upcoming conferences:

  • April 15-17: PA-DE AER Conference (Harrisburg, PA)
  • April 16-18: CTEBVI Conference (Los Angeles, CA)
  • April 16-18: Sensotec at ZieZo (Utrecht, Netherlands)
  • April 23-26: NFB Utah State Convention (Ogden, UT)

Have a story or feedback? Write to us at [email protected]

Find out more at https://see-things-differently-with-hu.pinecast.co