I mean, theoretically things could be anonymized for the AI, with only de-identified charts presented to it, assuming the AI itself stays locked inside the EHR system and isn't, say, outsourced to one of the big AI firms. Under those conditions it's roughly the same privacy exposure as the existence of EHRs in general.
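To make the "de-identified charts only" idea concrete, here's a rough sketch. The field names and chart structure are entirely made up for illustration, not any real EHR schema or vendor API; the point is just that direct identifiers get stripped before anything reaches the model, which is assumed to run inside the EHR system rather than a third-party service.

```python
from copy import deepcopy

# Fields treated as direct identifiers in this toy example (hypothetical names).
IDENTIFIER_FIELDS = {"name", "mrn", "ssn", "address", "phone", "dob"}

def deidentify_chart(chart: dict) -> dict:
    """Return a copy of the chart with direct identifiers removed,
    leaving only the clinical content for the in-EHR model."""
    cleaned = deepcopy(chart)
    for field in IDENTIFIER_FIELDS:
        cleaned.pop(field, None)
    return cleaned

# Only the de-identified copy is ever handed to the model.
chart = {
    "mrn": "123456",
    "name": "Jane Doe",
    "dob": "1961-04-02",
    "notes": "Progressive dyspnea on exertion over 3 weeks...",
    "labs": {"bnp": 900, "creatinine": 1.4},
}
model_input = deidentify_chart(chart)
```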
As far as the practical/legal/ethical side, that comes down to how they market it to doctors. Personally I think it could be a useful tool for a doctor to "second opinion" himself, i.e. reach his own conclusion first, then run the AI and see if it noticed something he missed. The obvious fear, of course, is lazy or rushed doctors working in a hospital that's pushing them to see as many patients as possible per hour, rewarding the doctors who walk in, hand the patient their AI diagnosis, and move on to the next room. Which... well, in modern America we all know that's what's going to be pushed.
The tools have amazing potential when used appropriately... but for-profit healthcare has all the wrong incentives, and it will see this as a tool to magnify them.