AppleVis is the go-to resource for blind and low vision users of Apple technologies. Our podcast discusses the latest in Apple vision accessibility. Topics cover OS and accessibility features, apps, interviews with developers, roundtable discussions, and more, all centered on unlocking the full potential of Apple hardware, software, and services. Tune in to learn how you can get the most out of your Apple devices, hear the latest accessibility news, and more.
In this episode, Thomas Domville demonstrates how to use GenMoji on iOS. GenMoji, introduced in iOS 18.2, is an AI-powered feature that allows users to create custom emojis simply by describing them in text. This innovative tool enables personalized and unique emojis, going beyond the standard set to enhance self-expression in messaging and other apps.
Update Your iPhone:
Enable Apple Intelligence:
Access GenMoji:
Create Your GenMoji:
Device Compatibility: GenMoji is available on iPhone 15 Pro, iPhone 15 Pro Max, and the iPhone 16 lineup and newer.
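For developers curious how their own apps can accept the GenMoji that users create, here is a brief aside not covered in the episode: on iOS 18, GenMoji arrive in text views as adaptive image glyphs, and a rich-text UITextView opts in with a single flag. This is a minimal, hedged sketch based on my reading of the iOS 18 SDK; the supportsAdaptiveImageGlyph property name is my recollection and should be verified against current Apple documentation.

```swift
import UIKit

// Minimal sketch (assumption-based, not from the episode) of an app opting in
// to GenMoji input. GenMoji are delivered as adaptive image glyph attachments
// inside attributed strings, so the text view must support rich text.
final class ComposeViewController: UIViewController {
    private let textView = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        textView.frame = view.bounds
        textView.allowsEditingTextAttributes = true   // rich text is required
        textView.supportsAdaptiveImageGlyph = true    // accept GenMoji from the keyboard (iOS 18)
        textView.font = .preferredFont(forTextStyle: .body)
        textView.adjustsFontForContentSizeCategory = true
        view.addSubview(textView)
    }
}
```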
Disclaimer: This transcript was generated by Aiko, an AI-powered transcription app. It is not edited or formatted, and it may not accurately capture the speakers’ names, voices, or content.
Hello and welcome.
My name is Thomas Domville, also known as AnonyMouse.
I'm going to be talking to you today about a feature called GenMoji.
I know, I know.
There are so many different ones: emoji, Memoji, GenMoji.
Thanks to Apple.
I know.
I hear you.
But that's what I'm here for.
I'm going to explain what GenMoji is and how it's different from the other emojis that you find throughout iOS.
Now, this is a really nifty feature that I'm starting to like quite a bit.
So all of you probably are familiar with emoji, right?
So emojis are little characters, small pieces of artwork that you can put into your messages, into WhatsApp, and into all sorts of other ways of communicating.
So you can send a little picture of something.
So usually they're kind of basic things, like cows, or foods like tomatoes.
And you probably know the more popular ones, like the smiley face emoji, the heart eyes emoji, or the red heart.
Those are emojis and I know there's plenty of those emojis, right?
There's like a thousand of these things, right?
However, sometimes you're going to find that there isn't quite the emoji that you want.
Let's just say I want a mouse wearing sunglasses, giving a thumbs up.
You're probably not going to find that emoji, but now we are able to design and customize our own emoji, called GenMoji.
It allows you to create any emoji you want, to your heart's content.
The sky is the limit on what you can create, but I'll show you some quirks and some setbacks that you're not…
In this month's edition of Apple Crunch, Thomas Domville, Marty Sobo, and John Gassman discuss recent Apple news and other topics of interest.
Chapters:
Resources:
If you have feedback or questions for the Apple Crunch team, you can reach them at [email protected]
Disclaimer: This transcript was generated by Aiko, an AI-powered transcription app. It is not edited or formatted, and it may not accurately capture the speakers’ names, voices, or content.
Hello and welcome. This is Apple Crunch, and my name is Thomas Domville, also known as AnonyMouse.
Along with me, I've got a couple of guys who came along to do this month's Apple Crunch.
Let's bring in Marty. How you doing, Marty?
Hey Thomas, how's it going?
I'm good.
Thank you.
I like that.
You're quick, simple, and fast, just like that. And then we have John Gassman on the other end. How you doing, John? Just to be different, I'll say that I really suck today. Really?
Well, I was kind of... I don't know. You don't? I don't think you do.
No, it's been a wonderful month. It's just Thanksgiving time here in the States, and getting the…
In this episode, Tyler demonstrates the basics of using iPhone Mirroring, a feature that allows you to use your iPhone from your Mac, with VoiceOver. Topics covered in this demonstration include:
More general information about this feature can be found in the Apple Support article "iPhone Mirroring: Use your iPhone from your Mac."
Disclaimer: This transcript was generated by Aiko, an AI-powered transcription app. It is not edited or formatted, and it may not accurately capture the speakers’ names, voices, or content.
Hey AppleVisers, Tyler here, with a demonstration and overview of iPhone Mirroring. iPhone Mirroring is a feature that allows you to essentially use your iPhone from your Mac.
This may be useful if you, for example, prefer a given service's mobile app over its desktop or web equivalent, or if what you're using doesn't have a desktop app or website.
In addition, as notifications from your iPhone can be forwarded to your Mac via iPhone mirroring, you may find it more convenient to receive and respond to the notification on your Mac than to switch devices if you're working on your Mac when the notification comes in.
With the initial release of macOS Sequoia 15.0, this feature was completely inaccessible with VoiceOver.
With the subsequent 15.1 update, it has become accessible, to a point.
I say to a point because at the time of recording, October 2024, there are several issues that, in my opinion, make the experience less than refined from a VoiceOver perspective, which I'll elaborate on later in this demo.
To use iPhone Mirroring with VoiceOver, you'll need an iPhone with iOS 18.1 or later and a Mac with macOS 15.1 or later.
All iPhones capable of running iOS 18 work with iPhone Mirroring, as do all Macs capable of running Sequoia, with the exception of the 2019 iMac, as that model lacks the T2 security chip or Apple silicon.
To set up iPhone mirroring, just open the iPhone mirroring app on your Mac and follow the on-screen instructions.
I've already done that, so now I'm going to demonstrate the feature.
I'm going to open iPhone mirroring on my Mac.
So here I am prompted to authenticate.
By default, whenever you open the iPhone mirroring app, you are prompted to authenticate as a way to verify your identity before it gives you control of your iPhone.
If you'd rather it didn't prompt you to authenticate each time you open the app, you can change this in iPhone Mirroring settings.
Just choose Settings from the menu bar or press Command-Comma, and it should be there.
But I'm going to authenticate with Touch ID now.
Close button.
Okay, so here I am in the iPhone mirroring window.
Okay, so we have home screen and app switcher.
Now you can use those buttons to get to those locations, or I find it more convenient to use the View menu or keyboard shortcuts…
Welcome to AppleVis Extra 101, where Dave Nason is joined by Xiaoran Wang and Huasong Cao from Agiga, the team behind the upcoming Echo Vision smart glasses. Check out some early demos, with more to come, on their YouTube channel at https://www.youtube.com/@AgigaAi/videos, and learn more on their website at https://echovision.agiga.ai/. The team would love to hear your feedback, so please comment below, or contact them through the website.
Disclaimer: This transcript was generated by Aiko, an AI-powered transcription app. It is not edited or formatted, and it may not accurately capture the speakers’ names, voices, or content.
Hello there and welcome to the AppleVis Extra.
This is episode number 101.
My name is David Nason and I'm delighted to be joined by two fantastic guests from Agiga.
We have Xiaoran Wang and Huasong Cao.
Is that correct, guys?
Thank you, David.
Thank you for having us.
So yeah, delighted.
And the product you're going to talk to us about is the Echo Vision.
So before we get into that, though, do you want to tell us a little bit about yourselves, the company, what you do, really, and how it all came about?
Sure.
My name is Xiaoran.
I'm the CEO, and before doing this company, actually, I had long experience building and shaping intelligent devices.
I started my career at Amazon Lab126.
That's the birthplace of the world's first Kindle and also the world's first Alexa.
And I was actually in the early team that developed Alexa devices.
And this experience gave me good exposure and enough experience in how to build a good intelligent device.
I think that's part of the confidence I brought into this company: that we can build a great product for the community.
Brilliant. And obviously Alexa, being a voice-first product, is very popular in the blind and visually impaired community.
So yeah, that's really cool.
And tell us about yourself, Hua Song.
Yeah, thanks, David.
So my name is Hua Song, and I'm an engineer by training.
Before I started this company with Xiaoran, I was with Google for about nine to ten years.
I did various software projects, and the latest one was Google Assistant, the voice assistant similar to Amazon's counterpart.
Yeah, I've been enjoying doing software, building stuff, both hardware and software.
And with Agiga, what we are trying to do is really use the expertise we built prior to this company and turn it into something that can really help everyone.
That's brilliant.
So do you want to quickly tell us what the product is, and then we can delve into a bit more detail.
Yeah, so the product is called EchoVision.
At a glance, it looks like a normal pair of glasses.
The key feature is to articulate visual information into voices.
Think about it.
If you can't see this, and there's someone next to you, how will this person help you?
So basically, he or she is going to describe it for you, like read it out for you.
And that's how we envision our product, basically.
It's like an assistant, a virtual assistant that does this work for you.
Well, I was going to…
In this episode, Thomas Domville demonstrates how to use Writing Tools in iOS 18.1. Writing Tools, released as part of the first set of Apple Intelligence features in iOS 18.1, is an impressive feature that allows users to rewrite text in different tones, such as friendly, professional, and concise, with an added proofread option.
To use this feature, open the text you want to edit and highlight it. Within the Edit section of your Rotor, you'll find the Writing Tools option. Different tone options will appear for you to choose from. Select the desired tone, and iOS will automatically adjust the text accordingly. If you opt for the proofread option, it will check for grammatical and stylistic errors, providing suggestions for improvement. This feature makes it easier to tailor your writing for different contexts and ensure it's polished and effective.
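As a developer-side aside not covered in the episode, apps can also control how Writing Tools behaves in their own text views. The sketch below is hedged and based on my reading of the iOS 18 SDK; the writingToolsBehavior property and the UIWritingToolsBehavior case names are my recollection and should be checked against current Apple documentation.

```swift
import UIKit

// Hedged sketch: configuring Writing Tools support for an app's own text view.
final class NotesViewController: UIViewController {
    private let editor = UITextView()

    override func viewDidLoad() {
        super.viewDidLoad()
        editor.frame = view.bounds

        if #available(iOS 18.0, *) {
            // .complete allows the full Writing Tools experience (rewrites in
            // different tones, proofreading); .limited keeps suggestions inline only.
            editor.writingToolsBehavior = .complete
        }
        view.addSubview(editor)
    }
}
```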
Disclaimer: This transcript was generated by Aiko, an AI-powered transcription app. It is not edited or formatted, and it may not accurately capture the speakers’ names, voices, or content.
Hello and welcome.
My name is Thomas Domville, also known as AnonyMouse.
I'm going to be talking to you about an Apple Intelligence feature called Writing Tools.
Now, this is one of my most favorite features that Apple Intelligence has to offer.
Now, in order to be able to use Apple Intelligence, your iPhone must meet the following criteria: it needs to be an iPhone 15 Pro, iPhone 15 Pro Max, or a newer device.
If you have one of those devices, then you're in luck.
The first thing we need to do is make sure that Apple Intelligence is turned on.
Simple enough to turn it on, you'll have to head over to your settings.
So let's head over to settings, and let's do one finger double tap and open this.
And now what we're looking for is Apple Intelligence & Siri, so swipe to the right until you find that button.
One finger double tap, open that.
Now you may have heard the word beta.
Well, at this time of the podcast that's being recorded, it is in beta.
So Apple Intelligence is in beta at this moment.
Now, if you don't hear the word beta, don't worry.
That might just mean that the feature is no longer in beta.
So to make sure we are in the same place in this right area, let's go to the very top left hand corner where you have the back button.
To get there, you can either tap at the top left or use a four-finger tap on the top half of your device until you hear the back button.
Now if you swipe to the right, that's where you heard the beta.
Now in your case, if you don't hear the beta, have no worries, let's keep going to the right.
Now we're into the Apple Intelligence section.
Now, if you go to the right here, you'll hear: a personal intelligence system integrated deeply into your iPhone, apps, and Siri; learn more, link; use the rotor to access links.
You can do that if you wish.
If you want to find more information about Apple Intelligence and all the features it has to offer, double tap that and it'll take you to a web page at Apple that tells you more about Apple Intelligence.
But what we're looking for is the option to turn Apple Intelligence on, and that should be the next item up.
If you go to the…
In this episode, Thomas Domville demonstrates the call recording and transcription feature in iOS 18.1. This update brings an exciting built-in call recording capability to iPhones. Recording a call is straightforward: just double-tap the Record button in the top left corner during a call (note that you might need to hide the keypad to see the button). A voice message will notify all participants that the call is being recorded. These recordings are saved directly in the Notes app, which uses end-to-end encryption to keep your conversations private. To review a call, simply go to the Notes app, and you'll find your recordings securely stored there. Additionally, iOS 18.1 offers automatic transcription, generating real-time captions and notes during your call. This feature makes it easy to reference important points later, with transcriptions handled entirely on your device to ensure privacy, with no data sent to external servers. Furthermore, supported iPhone models can provide summaries of these recordings, making it even easier to keep track of your conversations.
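The built-in call recording and transcription described above has no public API, so the following is only an analogous sketch of the on-device transcription idea using Apple's Speech framework, which keeps processing on the device in the same spirit as the feature's privacy model. The function name is hypothetical, and an app using it would need the standard speech-recognition permission entry in its Info.plist.

```swift
import Speech

// Analogous sketch (not the built-in call recording feature): transcribe an
// audio file entirely on device with the Speech framework.
func transcribeOnDevice(fileURL: URL) {
    SFSpeechRecognizer.requestAuthorization { status in
        guard status == .authorized,
              let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
              recognizer.isAvailable else { return }

        let request = SFSpeechURLRecognitionRequest(url: fileURL)
        // Keep all processing on the device; no audio is sent to a server.
        request.requiresOnDeviceRecognition = true

        _ = recognizer.recognitionTask(with: request) { result, error in
            if let result, result.isFinal {
                print(result.bestTranscription.formattedString)
            } else if let error {
                print("Transcription failed: \(error)")
            }
        }
    }
}
```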
Disclaimer: This transcript was generated by Aiko, an AI-powered transcription app. It is not edited or formatted, and it may not accurately capture the speakers’ names, voices, or content.
Hello and welcome.
My name is Thomas Domville, also known as AnonyMouse.
I want to show you a feature called call recording and transcription.
This is a beautiful feature that allows you to record your phone call.
And yes, you're able to get a transcription of what transpired within that conversation of the phone call.
Very useful for any time that you want to record a meeting or conversation.
Perhaps you're doing an interview with somebody, or you simply want to record a call for keepsake.
Maybe it's a call with somebody close to your family that you just want to have for the record, for archival, or maybe you have a business transaction or phone call you just want to make sure is recorded.
Now a couple things I want to note about this feature is that I always like to let people know that I'm about to record them.
It's a respectful thing to do, so they're not caught off guard.
Some people can be a bit leery about that kind of thing, if all of a sudden you spring it on them that you're recording their phone calls.
Make sure you tell them that you're about to record them.
Now, Apple put a great safety feature in this: once you begin the recording, no matter what you do, it's going to let the other party know that the phone call is being recorded.
So that is a nice feature that Apple ensured that everybody knows that the phone call is being recorded.
Now before you can use this feature you'll have to turn this option on and that's within the settings.
So let's head over to settings.
Settings.
Double tap to open.
You one-finger double tap on Settings.
Settings.
Now, what we're looking for is the Phone app.
The Phone app now lives in the Apps section.
Yep.
It's at the very bottom of Settings, so I'm going to do a four-finger tap on the bottom half of my phone here.
Apps.
Button.
Which will get me directly to the App…
In this podcast, Thomas Domville showcases the Audio Ducking feature on iOS. Audio Ducking automatically reduces the volume of background audio when a foreground sound, such as a notification or VoiceOver, is played. This feature is particularly beneficial for accessibility, ensuring that important sounds or speech are clearly audible without being overshadowed by other audio.
To adjust the Audio Ducking settings on iOS, follow these steps:
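The user-facing Audio Ducking setting lives under Settings > Accessibility > VoiceOver > Audio. Separately, as a developer-side illustration of the ducking concept itself, an app that plays its own spoken audio can ask iOS to temporarily lower other audio using standard AVFoundation API. This is a minimal sketch under that assumption, not part of the VoiceOver setting.

```swift
import AVFoundation

// Minimal sketch of audio ducking from an app's perspective: activating an
// audio session with .duckOthers lowers other apps' audio while this app plays.
func beginDuckedPlayback() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playback,
                            mode: .spokenAudio,
                            options: [.duckOthers])
    try session.setActive(true)
    // ... play the foreground audio here ...
}

func endDuckedPlayback() throws {
    // Deactivating with .notifyOthersOnDeactivation lets other apps
    // restore their volume once the foreground audio finishes.
    try AVAudioSession.sharedInstance()
        .setActive(false, options: [.notifyOthersOnDeactivation])
}
```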
Disclaimer: This transcript was generated by Aiko, an AI-powered transcription app. It is not edited or formatted, and it may not accurately capture the speakers’ names, voices, or content.
Hello and welcome.
My name is Thomas Domville, also known as AnonyMouse.
I'm gonna be talking to you today about audio ducking.
Now some of you out there are familiar with audio ducking and some that are not.
So let's talk about what audio ducking is, for those who are not familiar with this feature.
What audio ducking really does is let you lower the media sound when you need it.
So, for example, say you're on a phone call talking to an operator or customer service or whatever it might be, and they want you to check your email to make sure you got something, read something off, or they sent you a text for verification.
Sometimes it's really, really hard to hear that VoiceOver voice, right?
With both playing at the same time, VoiceOver can be difficult to hear.
Sometimes it's quieter than the other sound.
Or in my case, I do a lot of music.
So I love listening to the music in the background.
However, if I'm trying to do something with VoiceOver at the same time, it's difficult to hear VoiceOver.
So let me give you an example of what it sounds like when you do not have audio ducking on.
So if I turn on music here, I'm just gonna swipe back and forth so you can hear how loud VoiceOver is relative to the music, and you'll see what I mean.
It's about the same, if not a little bit less; it just depends on the situation.
So here's the music.
Make sure that what I tell you makes sense.
Mail, no unread emails.
Messages, one unread message.
So it's hard to hear, right?
So you don't have that ability to be able to hear it very well.
Well, by turning audio ducking on, you can duck the media in whatever situation you're in, so VoiceOver comes out on top, clear and concise.
Now, for those who are familiar with audio ducking, the old-fashioned audio ducking allowed us to duck, right?
But it was a hard-set value, meaning that we have no…
In this month's edition of Apple Crunch, Thomas Domville, John Gassman, and Marty Sobo discuss recent Apple news and other topics of interest.
Topics featured in this episode include:
Links:
If you have feedback or questions for the Apple Crunch team, you can reach them at [email protected]
Disclaimer: This transcript was generated by Aiko, an AI-powered transcription app. It is not edited or formatted, and it may not accurately capture the speakers’ names, voices, or content.
Hello and welcome to Apple Crunch for September 2024.…
In this podcast, Thomas Domville reviews and demonstrates the Voices feature, which allows you to customize multiple VoiceOver voices to suit your needs. You can quickly access these voices using the Rotor Actions or the VoiceOver Quick Settings.
How to Add VoiceOver Voices to the Voices Feature on iOS
Now, you can easily switch between your customized VoiceOver voices to enhance your accessibility experience on iOS 18.
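This episode covers the VoiceOver Voices feature, which is configured entirely in VoiceOver settings and the rotor. As an analogous developer-side illustration of switching voices on the fly, the hedged sketch below shows how an app can list installed system voices and speak with a chosen one via AVFoundation; the function and the voice name parameter are hypothetical.

```swift
import AVFoundation

// Analogous sketch (not the VoiceOver Voices feature): pick an installed
// speech voice by name and speak with it, falling back to a default voice.
func speak(_ text: String, preferredVoiceName: String) {
    let voices = AVSpeechSynthesisVoice.speechVoices()
    let voice = voices.first { $0.name == preferredVoiceName }
        ?? AVSpeechSynthesisVoice(language: "en-US")

    let utterance = AVSpeechUtterance(string: text)
    utterance.voice = voice

    // In a real app, keep a strong reference to the synthesizer so speech
    // isn't cut off when it goes out of scope.
    let synthesizer = AVSpeechSynthesizer()
    synthesizer.speak(utterance)
}
```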
Disclaimer: This transcript is generated by AIKO, an automated transcription service. It is not edited or formatted, and it may not accurately capture the speakers’ names, voices, or content.
Hello and welcome.
My name is Thomas Domville, also known as AnonyMouse.
I'm going to be talking about a feature called Voices.
So as you know, we have our primary VoiceOver voice that we use each and every day on our device.
Wouldn't it be great if you were able to access various voices, more than just the one VoiceOver voice, on the fly?
Yep, you can do that.
It lives right in your rotor, if that's where you would like it to be.
In my case, I have it in my rotor itself.
You can also put it in the VoiceOver Quick Settings if you wish.
And I'll be showing you how to add it to your rotor and Quick Settings if that's something you want to do.
But in my case, whenever I do a podcast, you probably always hear that I use the Siri Voice 4 voice, which in short is Noel.
And this is what I use when I do podcasting, but every so often I like to change things up and use some other voices, and those would be Tom and Hans.
So those are my top two voices.
And so, in order to access them quickly and easily, I placed Voices within my rotor.
So let me give you an example of what it sounds like and what it looks like.
So I'm going to access my rotor and I'm going to go to Voices.
Voices, Siri voice 4, default, selected.
So if I swipe up, Tom, primary voice.
I have the Tom primary voice.
Or if I could just swipe up again, Siri voice 4, default.
I'm back to Siri voice number 4.
So this is what I'm talking about: how you can access voices so easily from your rotor or your Quick Settings, if that's what you choose to do.
So let me show you how I got that set up.
But before we can do anything, we need to add voices so you can…
In this episode, Tyler demonstrates how to customize the lock screen on iOS, specifically how to remove the flashlight and camera buttons and replace them with other controls.
In addition to viewing the time, date, and notifications, the Lock Screen can be customized to remove or replace the camera and flashlight buttons with other controls, or show certain types of information at a glance, such as upcoming calendar events or current weather conditions. To customize the Lock Screen, perform a one-finger triple-tap on either the time or date, double-tap Customize, and then double-tap “Customize Lock Screen wallpaper.” From here, you can double-tap the Remove buttons for default controls, the “add quick action” button to select alternative controls, or the “Add widget” button to select a widget.
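The alternative controls mentioned above come from apps that publish controls to the system. As a developer-side aside, the hedged sketch below shows roughly how a third-party app could supply a control that can replace the flashlight or camera button on iOS 18, based on my reading of the WidgetKit controls API introduced at WWDC 2024; the kind string and StartWorkoutTimerIntent are hypothetical, and the ControlWidget types should be verified against current Apple documentation.

```swift
import WidgetKit
import SwiftUI
import AppIntents

// Hedged sketch of a Lock Screen / Control Center control (iOS 18 WidgetKit controls).
struct WorkoutTimerControl: ControlWidget {
    var body: some ControlWidgetConfiguration {
        StaticControlConfiguration(kind: "com.example.workout-timer") {
            ControlWidgetButton(action: StartWorkoutTimerIntent()) {
                Label("20-Minute Timer", systemImage: "timer")
            }
        }
    }
}

// Hypothetical intent that runs when the control is tapped.
struct StartWorkoutTimerIntent: AppIntent {
    static var title: LocalizedStringResource = "Start Workout Timer"

    func perform() async throws -> some IntentResult {
        // Start a 20-minute timer here.
        return .result()
    }
}
```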
Disclaimer: This transcript is generated by AIKO, an automated transcription service. It is not edited or formatted, and it may not accurately capture the speakers’ names, voices, or content.
Hey, AppleVisers, Tyler here with a quick tip for how to customize the Lock Screen on iOS.
By default, the iOS lock screen includes the time, date, any notifications received since the device was last used, and at least on devices without a home button, shortcuts to the flashlight and camera functions.
Over the years, the iOS lock screen has gradually become more customizable, with the ability to add widgets introduced with iOS 16 in 2022, and the ability to remove the camera and flashlight functions or replace them with other controls the user might find more useful introduced with iOS 18 in 2024.
If, like me, you don't find the camera or flashlight functions particularly useful, or at least not useful enough to where you would want them to be among the first things you see when you wake your iPhone, you can replace them with other things you might find more useful.
So for me, I replaced them with a shortcut to the alarm and also a single-action shortcut that I created to set a 20-minute timer.
So when I'm about to work out, I just take out my phone, wake it, unlock it, and double tap the workout timer button on the lock screen.
And when I want to set an alarm, I don't have to go into Control Center or open the clock app or use Siri anymore.
I just double tap the alarm button on the lock screen and I'm taken right there.
So to demonstrate this, I'm going to wake my iPhone now, and I'm just going to explain things first so I don't have to explain while VoiceOver is talking and compete with that sound.
Once I unlock it, I'm going to triple tap either the time or the date.
Either one works.
You can triple tap or you can double tap and hold either one.
So I'm going to wake my iPhone now.
Do not disturb Friday 1 a.m. Okay, triple tap.
Astronomy wallpaper weather sunrise and sunset widget and clock next alarm widget button and illustration of red, blue and yellow rectangle.
Okay, so if I swipe left astronomy, that's the first element on the screen.
It's the current wallpaper I have.
You can have multiple.
So if you want to have different lock screens, like, for example, if you're working, you might want access to different types of information than if you're just on your own time.
If you want to link focuses, you can do that.
So if you have a work focus, you can have it…
In this episode, Tyler demonstrates some of VoiceOver's command customization capabilities on macOS.
If you find a particular VoiceOver command difficult to perform, or discover a function in the Commands menu that doesn’t have a default command, you can assign your own custom command to it. In addition, you can configure commands to open apps and run scripts, so you don’t have to locate them manually.
Commands can be configured by going to VoiceOver Utility > Commands, selecting the “Command set: user” radio button, and clicking “Custom commands edit.” For ease of navigation, you can choose the type of commands you want to view or change, such as numpad, trackpad, keyboard, etc., from the "Filter commands" popup menu, or use the search field to locate a particular command.
In this dialog, commands can be presented in either column view, which organizes commands into categories like general, information, and navigation, or table view, which displays a list of all VoiceOver commands, including user-configured ones, which you can navigate with the up and down arrow keys. To add a command in column view, locate the command, interact with the table of assignments, and specify your new one using the "Add input" popup menu. To add a command in table view, click the Add button, interact with the table, and specify the input assignment from the popup menu labeled "None: edited." Then, press VO-Right-Arrow past an empty cell to another popup menu, and choose the command you want your new input assignment to perform.
Disclaimer: This transcript is generated by AIKO, an automated transcription service. It is not edited or formatted, and it may not accurately capture the speakers’ names, voices, or content.
Hey, AppleVisers.
Tyler here.
With a demonstration and walkthrough of VoiceOver command customization on macOS.
Prior to macOS Sequoia, VoiceOver included several user configurable sets of commands, known as commanders, for the numpad, trackpad, keyboard, and quick nav.
With macOS Sequoia, these commanders have been consolidated into VoiceOver's broader command set, meaning in addition to the existing modifiers that you could use, like the option key for keyboard commander, you can also create your own command assignments using the VoiceOver modifier, which may be useful if you find a particular VoiceOver command difficult to perform, if you find a command in the commands menu, for example, that lacks a default assignment, or if you want to create a custom command to open an application.
So to demonstrate this, I'm going to open VoiceOver Utility on my Mac with VO-F8.
Opening VoiceOver utility.
VoiceOver utility.
VoiceOver utility.
Window.
Utility categories.
C for commanders.
Commands.
Commands.
VRA.
VoiceOver modifier.
Control option or caps lock.
VoiceOver modifier.
VoiceOver modifier.
This is the setting that was located in the General category in prior versions of macOS, but the options are the same:
Control-Option, Caps Lock, or Control-Option or Caps Lock, which is the default.
VRA.
Also control VoiceOver with.
Also control VoiceOver with.
Numpad.
Uncheck.
Checkbox.
Numpad, which was formerly known as Numpad Commander.
If you want to use, if you have a…