
Apple has introduced new accessibility features that will be very useful on iPhone and iPad.
Apple has always looked after everyone who uses its products.
These features are designed primarily for people with disabilities. Apple says it studied step by step how to add them, including only the ones that are genuinely useful.
Live Speech is mainly aimed at people who have difficulty speaking: they can type what they want to say and have it spoken aloud during calls and face-to-face conversations.
Personal Voice is for people who may find it difficult to speak, or who are at risk of losing their voice. After recording about 15 minutes of audio, the device creates a synthesized voice that sounds like the user, which can then be used to speak on their behalf.
Detection Mode is for people who are blind or have low vision. Using the camera and the LiDAR sensor, it reads aloud the text the user points a finger at. This works through the Magnifier app, which tells you what you are about to touch.
Assistive Access replaces the standard iOS interface with a simple, easy-to-use layout, showing only the essential apps the user actually wants, such as Camera, Phone, Messages, Music, and Photos.
Apple has said these features will arrive later this year, which is why many reports expect them to ship in iOS 17.
Personally, I see these as genuinely useful UI features, not just a long feature list. A UI crammed with features is not automatically good; Apple has thought about every kind of user, and to my mind a design only deserves to be called good when it stays convenient for all of them.
Thank you for reading to the end.
