Pete Recommends – Weekly highlights on cyber security issues August 17, 2019

Subject: Social Security sent out some wrong estimates. What you need to know
Source: Detroit Free Press via Yahoo

Who would imagine that you could get a wrong estimate of your Social Security benefits from — guess who — the Social Security Administration?

The story is yet another reminder about how you need to double-check your numbers, take a hard look at what you’re being told and keep good records of previous Social Security statements.

In an odd quirk, Social Security finally acknowledged in early August that a glitch caused the agency to send out some incorrect “On Request” paper statements. The troubled reports were triggered if you asked for information on your Social Security account via a paper form, known as an SSA-7004. 

“Of the tens of thousands of paper requests the agency receives annually, less than 1% of those statements issued contained errors,” according to an email from Mark Hinkle, acting press officer in the national Social Security office.

Subject: Facial Recognition Software Prompts Privacy, Racism Concerns in Cities and States
Source: The Pew Charitable Trusts – Stateline

Fabian Rogers was none too pleased when the landlord of his rent-stabilized Brooklyn high-rise announced plans to swap out key fobs for a facial recognition system.

He had so many questions: What happened if he didn’t comply? Would he be evicted? And as a young black man, he worried that his biometric data would end up in a police lineup without him ever being arrested. Most of the building’s tenants are people of color, he said, and they already are concerned about over-policing in their New York neighborhood.

“There’s a lot of scariness that comes with this,” said Rogers, 24, who along with other tenants is trying to legally block his management company from installing the technology.

“You feel like a guinea pig,” Rogers said. “A test subject for this technology.”

Amid privacy concerns and recent research showing racial disparities in the accuracy of facial recognition technology, some city and state officials are proposing to limit its use.

[How do you have house-guests? /pmw1]

“Our industry certainly needs to do a better job of helping educate the public how the technology works and how it’s used,” said Jake Parker, senior director of government relations for the Security Industry Association, a trade association based in Silver Spring, Maryland.

A positive match from facial recognition software is not sufficient to charge a suspect with a crime, Craig said.

Topics: Safety Net & Business of Government

RSS selector, for both TOPICS and PROJECTS:

Subject: Over 40 Windows Hardware Drivers Vulnerable To Privilege Escalation
Source: Bleeping Computer

Researchers analyzing the security of legitimate device drivers found that more than 40 from at least 20 hardware vendors can be abused to achieve privilege escalation.

Hardware represents the building blocks of a computer on top of which software resides. Drivers are what allow the operating system to identify the hardware components and interact with them.

Driver code enables communication between the OS kernel and the hardware, running at a higher permission level than either a normal user or the system administrator.

Therefore, vulnerabilities in drivers are a serious issue as they can be exploited by a malicious actor to gain access to the kernel and get the highest privileges on the operating system (OS).

Since drivers are also used to update hardware firmware, they can reach components operating at an even deeper level that is off-limits to the OS, change the way those components function, or brick them.

BIOS and UEFI firmware, for instance, are low-level software that starts before the operating system, when you turn on the computer. Malware planted in this component is invisible to most security solutions and cannot be removed by reinstalling the OS.

Subject: Hackers Can Turn Everyday Speakers Into Acoustic Cyberweapons
Source: WIRED

Speakers are everywhere, whether in expensive standalone sound systems, laptops, smart home devices, or cheap portables. And while you rely on them for music or conversation, researchers have long known that commercial speakers are also physically able to emit frequencies outside the range audible to humans. At the Defcon security conference in Las Vegas on Sunday, one researcher is warning that this capability has the potential to be weaponized.

It’s creepy enough that companies have experimented with tracking user browsing by playing inaudible, ultrasonic beacons through their computer and phone speakers when they visit certain websites. But Matt Wixey, cybersecurity research lead at the technology consulting firm PWC UK, says that it’s surprisingly easy to write custom malware that can induce all sorts of embedded speakers to emit inaudible frequencies at high intensity, or blast out audible sounds at high volume. Those aural barrages can potentially harm human hearing, cause tinnitus, or even possibly have psychological effects.

“I’ve always been interested in malware that can make that leap between the digital world and the physical world,” Wixey says. “We wondered if an attacker could develop malware or attacks to emit noise exceeding maximum permissible level guidelines, and therefore potentially cause adverse effects to users or people around.”
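Wixey's point is easy to illustrate in a few lines. This sketch (a minimal illustration, not his tooling) synthesizes raw PCM samples for a 21 kHz sine tone: inaudible to most adults, yet well within the Nyquist limit of a standard 44.1 kHz audio pipeline, so any program with speaker access could play it unnoticed:

```python
import math

def tone(freq_hz, duration_s, sample_rate=44100, amplitude=1.0):
    """Generate raw PCM samples for a pure sine tone.

    Above roughly 17-20 kHz a tone is inaudible to most adults, but a
    commercial speaker driven at full amplitude still reproduces it,
    up to the Nyquist limit of sample_rate / 2.
    """
    n = int(sample_rate * duration_s)
    return [amplitude * math.sin(2 * math.pi * freq_hz * i / sample_rate)
            for i in range(n)]

# Half a second of a 21 kHz tone at full volume: silent to the ear,
# but real acoustic output once fed to any standard audio API.
samples = tone(21000, 0.5)
```

Feeding such samples to the OS audio stack is a one-liner in most languages, which is the crux of the warning: nothing in a commodity speaker stops software from driving it at high intensity outside the audible band.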


bonus from Defcon:

Subject: New Student Privacy Report Gives Guidance on Algorithms in K-12 Education
Source: Center for Democracy & Technology (CDT) via LJ infoDOCKET

From the Center for Democracy & Technology (CDT): Some K-12 school districts are beginning to use algorithmic systems to assist in making critical decisions affecting students’ lives and education. Some districts have already integrated algorithms into decision-making processes for assigning students to schools, keeping schools and students safe, and intervening to prevent students from dropping out. There is a growing industry of artificial intelligence startups marketing their products to educational agencies and institutions. These systems stand to significantly impact students’ learning environments, well-being, and opportunities. However, without appropriate safeguards, some algorithmic systems could pose risks to students’ privacy, free expression, and civil rights.

To address these considerations, education leaders and the companies that work with them should take the following actions when designing or procuring an algorithmic system:

NB CDT topic on Privacy & Data:


[only one current item, but maybe your news aggregator has aggregated the older ones? (mine has not) /pmw1]

Subject: IRS Security Summit Series for Tax Professionals: Create a Data Theft Recovery Plan
Source: DHS via CISA

[Ed. note: obviously, not just for Tax Pros … ]

The fifth and final step in the Internal Revenue Service (IRS) Security Summit series for tax professionals is creating a data theft recovery plan. IRS issued a news release highlighting the importance of understanding the risks posed by national and international cybersecurity criminal syndicates, working with cybersecurity experts to help prevent and stop data theft, and reporting data theft as soon as possible. Creating a data theft recovery plan is part of the “Taxes. Security. Together.” Checklist, which IRS created to help tax professionals protect sensitive taxpayer data.

The Cybersecurity and Infrastructure Security Agency (CISA) encourages tax professionals to review the IRS news release and the following Security Summit series topics for more information:

NB other CISA Current Activities:

[bonus point if you find the RSS feed 🙂 /pmw1]

Subject: Senators Call for Closing “Loopholes” That Make Health Care Fraud Easy
Source: ProPublica

The letter came in response to a recent story by ProPublica and Vox that traced the brazen scam of a Texas personal trainer, who despite having no medical credentials was able to submit a blizzard of fake bills to some of the biggest insurance companies in the country and collect millions. The story revealed not only how David Williams exploited weaknesses at each step, but how slowly the insurers responded to his ongoing fraud.

Williams’ con, for which he was later prosecuted, was initially enabled by the Centers for Medicare and Medicaid Services. The federal agency issues and administers National Provider Identifiers, or NPIs, the unique numbers medical providers need to bill insurance plans. ProPublica found that Medicare doesn’t check the credentials of medical providers who apply for NPI numbers, such as whether they have valid licenses, which means scammers can lie to obtain them. Williams obtained at least 20 NPI numbers and used them to bill insurers.

Current and former investigators for health insurers and federal prosecutors said the simplicity of Williams’ scam, combined with weak oversight and enforcement, raises questions about what’s being done to combat rampant health care fraud. Experts estimate fraud consumes about 10% of the country’s $3.5 trillion health care tab — although the story found that large swaths of fraud are not being tracked. Those losses are eventually passed on to the public in the form of higher monthly premiums and out-of-pocket costs as well as reduced benefits.
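For scale, that estimate works out to hundreds of billions of dollars a year; a quick back-of-the-envelope check:

```python
# Back-of-the-envelope scale of the fraud estimate cited in the story:
# roughly 10% of the country's $3.5 trillion annual health care tab.
total_spend = 3.5e12    # US annual health care spending, per the story
fraud_share = 0.10      # experts' rough estimate of the fraud share
fraud_dollars = fraud_share * total_spend
print(f"${fraud_dollars / 1e9:.0f} billion per year")  # prints "$350 billion per year"
```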

Subject: More than 1 million people had their fingerprint data exposed by a huge security hole
Source: BGR

The easiest thing you can do to protect yourself after a data breach is to change the exposed password, preferably to something unique and more secure than the previous one. However, if hackers steal your actual fingerprint — mind you, not an encrypted version of it — then there’s really nothing you can do to prevent anyone from abusing it since you obviously can’t change your fingerprints. And that’s precisely what could have easily happened with Suprema’s Biostar 2 biometrics lock systems.

Researchers found a security hole in Suprema’s system that let them access authentication data belonging to more than 1 million users. The data includes fingerprints, facial recognition information, unencrypted usernames and passwords, and personal information of employees, according to The Guardian.

Israeli researchers Noam Rotem and Ran Locar, working with vpnMentor, found the vulnerability. Once they gained access to the Biostar 2 database, they found it unprotected and mostly unencrypted, and were easily able to access more than 27.8 million records totaling over 23 GB.
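The plaintext passwords are a failure with a well-known fix; the fingerprints are not. A leaked salted hash forces an attacker to guess each password, and a compromised password can be rotated, while a raw fingerprint template can never be changed. Here is a minimal sketch of salted hashing with Python's standard library (an illustration of standard practice, not a description of how Biostar 2 stored anything):

```python
import hashlib
import hmac
import os

def store_password(password: str) -> tuple[bytes, bytes]:
    """Return (salt, hash) to store instead of the raw password."""
    salt = os.urandom(16)  # unique random salt per user
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute the hash and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return hmac.compare_digest(candidate, digest)
```

None of this helps with biometrics, which is the article's point: fingerprint templates are fuzzy matches rather than exact strings, and unlike passwords they are unrevocable, so raw biometric data demands far stricter protection than any password database.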

Tags: biometrics, Suprema

sample RSS:

Posted in: Big Data, Civil Liberties, Congress, Cybercrime, Cybersecurity, Financial System, Government Resources, Legal Research, Privacy, Social Media