Subject: Public health must diversify which data ‘count’ in AI algorithms
Source: The Hill
Despite AI’s promise as a public health surveillance and prediction tool, health professionals across sectors have raised wide-ranging concerns about its long-term viability, citing poor model interpretability, insufficient infrastructure, a lack of regulation, barriers to data sharing, and, of course, broader privacy and ethical concerns. Arguably the most troubling concern, however, centers on AI’s potential to encode bias into its algorithms, which could disproportionately affect marginalized populations because of their underrepresentation in scientific research more generally.
Proof of this bias exists. For example, recent evidence of racial bias was found in a commercial algorithm commonly used in U.S. hospitals: Black patients were consistently assigned lower risk scores than white patients who were equally sick. Issues like these are enough to give industry experts pause about fast-tracking AI implementation into ubiquity.
As an assistant professor in public health at San Jose State University, situated in the heart of Silicon Valley, I have become all too familiar with how “hierarchy of data” debates present in many technocratic-driven spaces such as the National Institutes of Health and the social innovation health sector work to privilege certain data forms over others.
Sample RSS feed: https://thehill.com/social-tags/public-health/feed/
Subject: Why Banks Are Suddenly Closing Customer Accounts
Source: New York Times
[note this link provides free access to the article = no paywall]
Increasing attention to suspicious-seeming transactions has led to some people suddenly losing access to their bank accounts. The reasons are often a mystery.
A rise in suspicious activity reports: With fraudulent activity on the rise, and exploding during the pandemic, some banks are taking an even harder look at their customers’ transactions and closing their accounts when they feel it’s necessary.
Besides the overall rise in fraudulent activity, several factors could be behind the increase in filings: more alerts from government officials tipping off banks to specific activities, increasingly sophisticated technologies to detect them, and more regulatory scrutiny.
Banks can close a customer’s account for any reason, at any time, a point that is buried in the fine print of their customer agreements. When they do dump an account, it’s usually because they’re trying to protect the institution (or the customer) from a potential fraud.
“The big thing I’ve learned here, and I think it’s applicable to a lot of places in our lives — say, if you’re investing money — is that you diversify,” he said. “If all of your credit or money is wrapped up in one bank, it can only benefit them.”
Subject: FTC settlement with GoodRx over sharing of health information
- We’ll tell applicable third parties (like Facebook) who received your health information to delete it.
- We’ll never share your health information with applicable third parties (like Facebook) for advertising purposes.
- We won’t share your health information with applicable third parties (like Facebook) for other purposes, unless we get your permission first.
- We’ll put in place a comprehensive privacy program with heightened procedures and controls to protect your personal and health information. An independent auditor will review our program to make sure we’re protecting your information. These audits will happen every two years for 20 years.
To learn more about the settlement, go to ftc.gov and search for “GoodRx”.
For advice on protecting your health privacy, read the FTC’s “Does your health app protect your sensitive info?”