Google’s “Voice Access” is decent for controlling the device through verbal commands, but you have to be looking at the screen to get results - it won’t read anything back to you.

Google’s “TalkBack” will read what’s on screen to you, but you have to interact with the screen physically (never mind the significant change in how interactions work - I understand the need for it, but it’s still a serious mental PITA to switch between the two interaction methodologies frequently).

Is there no way to just interact with it entirely verbally? A (very) simple example of what I’m looking for:

  1. “What are the current Google News headlines?”
  2. Starts reading each one aloud, along with the names of the sources.
  3. “Read the article about Trump caught making out with Elon from AP News.”
  4. Proceeds to load the article & read it aloud.

(Yeah, I know there are podcasts for this - it’s meant to illustrate the basic idea of completely verbal interaction with the device, not be an actual problem I’m looking for someone to provide a solution to.)
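The four-step exchange above can be sketched as a toy command loop - spoken command in, spoken text out, no screen involved at any step. Everything here is hypothetical illustration (the headline data, function names, and command phrasing are made up); a real version would have to sit on top of something like Android’s accessibility and speech APIs.

```python
# Toy model of a fully verbal interaction loop: every step is a spoken
# command in and spoken text out -- no looking at or touching the screen.
# All names and data here are hypothetical stand-ins.

HEADLINES = [  # stand-in for content pulled from an app's UI
    ("AP News", "Example headline one"),
    ("Reuters", "Example headline two"),
]

def speak(text):
    """Stand-in for text-to-speech output; returns what would be spoken."""
    return text

def handle_command(command):
    """Map a spoken command to a list of spoken responses."""
    command = command.lower()
    if command == "read headlines":
        # Step 1-2: read each headline aloud along with its source.
        return [speak(f"{source}: {title}") for source, title in HEADLINES]
    if command.startswith("open article from "):
        # Step 3-4: load and read the article from the named source.
        wanted = command.removeprefix("open article from ")
        for source, title in HEADLINES:
            if source.lower() == wanted:
                return [speak(f"Opening '{title}' from {source}...")]
        return [speak(f"No article from {wanted} found.")]
    return [speak("Sorry, I didn't understand that.")]
```

The hard part in practice isn’t this loop - it’s the middle layer that turns an arbitrary app’s screen into readable text and its controls into voice-invokable actions, which is exactly the gap described above.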

It just seems to me that we should be able to do this by now - especially with all the AI blow-up over the past couple of years. Can anybody point me to a usable solution to accomplish this?

TIA.

EDIT: I thought of a better example (I think), because it occurred to me that the above one could (sort of) be done with a Google Home speaker. I’m looking to be able to interact with Android apps verbally wherever possible, so my better example is “What are the latest posts made to the ‘No Stupid Questions’ community on Lemmy?” So far as I know, Google Home is not able to do such a thing. I’d like to tell Android to open my Lemmy client and start reading post headlines until it hits one I want it to open & read to me.

I’m basically looking to use apps verbally to fill in gaps that Google Home/Assistant don’t cover.

EDIT 2: Here’s an even better, more universally applicable description of what I’m after - copied from a response I gave to another comment:

Imagine someone doing some relatively mindless, menial job - assembly-line work, janitorial work, chauffeuring - something where your mind is relatively unoccupied, but you’re not free to look at and/or touch your device (whether due to practicality or job rules). While doing that job, I want to be able to have the device read and interact with something of interest to me at that moment (ADHD is a fickle mistress), rather than just relying on podcasts with predefined content. Kind of like having someone next to me doing all the interfacing between me and the device.

  • Catoblepas@lemmy.blahaj.zone · 2 days ago

    Sorry if I’m overlooking something obvious here, but you’re basically asking about accessibility features, right? Whatever settings and features blind users use should let you navigate without looking at it. I realize that’s overly vague but I only know how to get to accessibility features on iOS, sorry!

    Although I will say you probably would need to get used to listening to text at a very high speed to use it at the speed you read.

    • SanctimoniousApe@lemmings.world (OP) · 2 days ago

      Imagine someone blind who also has Parkinson’s - they can’t see to use Voice Access, and they can’t control their hands well enough to interact with the screen reliably using TalkBack. You can’t actually use those two accessibility features together - they are mutually exclusive in that they require you either be able to see the screen OR be able to interact with it physically as it reads out what you’re touching. Why is there no way to interact entirely verbally?

      ETA: please see my added example in OP.

      • Catoblepas@lemmy.blahaj.zone · 2 days ago

        Someone who knows more about whether/how you can do this on Android will have to answer the specifics; I know on iOS you can use custom Voice Control actions (along with the default Voice Control phone navigation mode) to do more or less what you’re describing. I’d be surprised if Android has no accessibility features that work similarly.

        • SanctimoniousApe@lemmings.world (OP) · 2 days ago

          iOS will open your Lemmy client, start reading posts to you aloud, and go into a post of interest upon command without you ever looking at or touching the screen (using my newer example that I added to the OP)? I’m seriously going to have to look into getting an iDevice of some sort if so.

          • Catoblepas@lemmy.blahaj.zone · 2 days ago

            I don’t know if this is still the case, but years back iPhones were preferred by a lot of blind people purely in terms of accessibility. Digging through accessibility settings, it looks like you can use Voice Control to tell it to open Lemmy, and VoiceOver to read all the text on the screen without touching it. I don’t know about the example you added to your OP - adding phrases it would need to interpret seems more like a Siri thing (which I don’t use), so I don’t know how well that plays with Voice Control.

            I wouldn’t rush out to buy anything unless some Android people confirm it’s not doable. Apple does have people that know the software working at their stores, so they could tell you specifics for sure. And check that I’m not totally wrong, lol.

            • SanctimoniousApe@lemmings.world (OP) · 2 days ago

              Yeah, I wouldn’t just jump in without looking first. If I can’t find a way to do this, then I’m definitely gonna have to take a trip to the nearest Apple Store, though. Thanks very much for the input!

  • Skezlarr@aussie.zone · 1 day ago

    I might be misunderstanding the question, so if you’re looking to make an existing Android device work like this, please ignore the below.

    Maybe a Google Nest Mini would do what you’re looking for? It can definitely be interacted with entirely verbally, but I guess the downside is a smaller selection of compatible apps (there’s a list on the Google Store of which apps it works with).

    • SanctimoniousApe@lemmings.world (OP) · 18 hours ago

      I have a few Google Home Minis, and as far as I know the Nest Mini is the same thing - just rebranded. It’s basically the same Google Assistant that’s built into every Google-approved/equipped device (meaning one with their full suite of apps pre-installed). They’re just so limited. I know of no way to get them to read Lemmy posts as in my added second example.

      I also thought of another use case for what I’m after that might be more universally applicable and easily understood. Imagine someone doing some relatively mindless, menial job - assembly-line work, janitorial work, chauffeuring - something where your mind is relatively unoccupied, but you’re not free to look at and/or touch your device (whether due to practicality or job rules). While doing that job, I want to be able to have the device read and interact with something of interest to me at that moment (ADHD is a fickle mistress), rather than just relying on podcasts with predefined content. Kind of like having someone next to me doing all the interfacing between me and the device.

      (EDIT: minor swipe keyboard corrections.)