As a medical doctor I extensively use digital voice recorders to document my work. My secretary does the transcription. As a cost-saving measure, the process is soon intended to be replaced by AI-powered transcription, trained on each doctor's voice. As I understand it, the model created is not stored locally and I have no control over it whatsoever.

I see many dangers, as the model is trained on biometric data and could possibly be used to recreate my voice. Of course I understand that there are probably other recordings of me on the Internet, enough to recreate my voice, but that's beside the point. Also, my question is about educating them; it's not a legal one.

How do I present my case? I'm not willing to use a non-local AI to transcribe my voice, but I don't want to be perceived as a paranoid nutcase. Preferably I want my bosses and colleagues to understand the privacy concerns and dangers of using a "cloud solution". Unfortunately they are totally ignorant of the field of technology, and the explanations/examples need to translate to the layperson.

    • FlappyBubble@lemmy.ml (OP) · 9 months ago

      My biometric data, in this case my voice. Training an AI, tailored to my voice, out of my control, hosted as a cloud solution.

      Of course there is an aspect of patient confidentiality too, but this battle is already lost. The data in the medical records is already hosted outside of my hospital.

      • SheeEttin@programming.dev · 9 months ago

        Sounds like a weak argument. They’re not going to be inclined to operate a local ML system just for one or two people.

        I would see if you can get a quote for locally-hosted transcription software you can run on your own, like Dragon Medical. Maybe reach out to your IT department to see if they already have a working relationship with Nuance for that software. If they’re willing to get you started, you can probably just use that for dictation and nobody will notice or care.

    • 520@kbin.social · 9 months ago

      Not OP, but if I were them: leakage of patient data. Even if OP isn't personally responsible, simply being tied to an incident like this can look very bad in fields that rely heavily on reputation.

      AI models are known to leak this kind of information; there are news articles about it all over.