Taylor Armerding, Synopsys Software Integrity Group senior strategist, gives you the scoop on application security and insecurity in this week’s Security Mashup.
This week: CamuBot malware, the new kid on the malware block; the sounds of hacking (SonarSnoop); and a return to the government’s wish for chat backdoors. Watch this week’s episode below.
via Tom Spring, Threatpost: Banking malware variants are created to hide. The longer they remain in your system undetected, the more credentials and account information they can steal, which attackers use to plunder your savings and investments. But the newest thief on the block, as several outlets have reported this past week, doesn’t try to hide. Instead, CamuBot malware relies on camouflage. Watch this segment to learn why it’s trending in security.
via Catalin Cimpanu, ZDNet: For some time now, the good guys have been touting biometrics as a good way to improve the security of our devices. So it shouldn’t be a surprise that the bad guys are interested in using biometrics to undermine security. Fortunately, an acoustic side-channel attack labeled SonarSnoop, aimed at stealing the unlock patterns from Android devices, has been reported by some good guys—researchers from universities in the U.K. and Sweden. Watch this segment to learn why this story is trending.
via Hannah Boland, The Telegraph: The battle between tech companies and law enforcement over encryption has been going on for some time now. Law enforcement officials say they support encryption and respect personal privacy—except when it comes to criminals and terrorists. They say they can’t do their jobs and keep us all safe if they are blinded by encryption. And they claim that mandatory chat backdoors will help. Watch this segment to learn why this story is trending.
Hello, and welcome to Episode 18 of the Weekly Security Mashup. I’m Taylor Armerding, senior security strategist with the Synopsys Software Integrity Group, back again to talk about what’s trending in software security and insecurity, including how to improve your own security.
So, at the top of this week, Page 1: New kid on the malware block. Banking malware variants—hundreds if not thousands of them—generally have one thing in common: They are created to hide. The idea is, the longer they can remain in your system undetected, the more credentials and account information they can steal, which attackers then use to plunder your savings and investments.
But the newest thief on the malware block, as several outlets reported this past week, doesn’t try to hide. It relies on disguise, or camouflage, and has therefore been labeled CamuBot—get it?—by IBM X-Force researchers, who said in a blog post last week that it masquerades as a required end user security module provided by a bank.
The researchers said CamuBot appeared last month in Brazil, targeting business users of major banks. They said it “mixes social engineering and malware tactics to bypass strong authentication and security controls.” That social engineering includes using bank logos and overall brand imagery so the malware appears to be a legitimate security application. If victims fall for it, they end up running an installation wizard for a Trojan horse instead of a security application.
And it is launched using a very sociable touch—a phone call. After casing businesses that use certain banks, the attackers call the person most likely to have the account credentials. The caller claims to be a bank employee and tells the victim to browse to a certain URL to make sure their security module is up-to-date. When the validity check fails—surprise!—the attackers then helpfully guide the victim through installing a new “security module.”
That module then insulates itself from detection by firewall or antivirus and sets up port forwarding, which lets the attackers direct their own traffic through the infected machine and use the victim’s IP address when accessing the compromised bank account. A victim who trusts them can even end up sharing biometric authentication credentials. And from there, the attackers can sit and watch the money roll in.
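Port forwarding here simply means relaying traffic through the victim’s machine, so that from the bank’s side the attacker’s connections appear to originate from the victim’s trusted IP address. A minimal conceptual sketch of such a TCP relay in Python (purely illustrative—this is not CamuBot’s code, and the ports and hosts are placeholders):

```python
import socket
import threading

def pump(src, dst):
    """Copy bytes one way until the source side closes its connection."""
    while True:
        data = src.recv(4096)
        if not data:
            break
        dst.sendall(data)
    dst.close()

def relay(listen_port, target_host, target_port):
    """Accept one inbound connection and forward it to the target.

    From the target's perspective, the traffic originates from this
    machine's IP address -- which is the point of the attack.
    """
    server = socket.socket()
    server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    server.bind(("0.0.0.0", listen_port))
    server.listen(1)
    client, _ = server.accept()
    upstream = socket.create_connection((target_host, target_port))
    # Shuttle bytes in both directions until either side hangs up.
    threading.Thread(target=pump, args=(client, upstream), daemon=True).start()
    pump(upstream, client)
```

The relay itself is ordinary networking plumbing; what makes it malicious is where it runs and who controls it.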
X-Force said they haven’t seen CamuBot “used in other geographies, but that may change over time.”
Page 2: The sounds of hacking. For some time now, the good guys have been touting biometrics as a good way to improve the security of your devices. So it shouldn’t be any surprise that bad guys might be interested in using it to undermine security. Fortunately, an acoustic side-channel attack labeled SonarSnoop, aimed at stealing the unlock patterns from Android devices, has been reported by some good guys—researchers from universities in the U.K. and Sweden.
According to the researchers themselves, it’s not an imminent threat. But their demonstration—that they can capture the sound waves generated by a finger sliding across the face of a phone to unlock it and then use those waves to figure out the unlock pattern—is definitely ominous. The attack requires getting a malicious app installed on the device. The app broadcasts a sound just beyond the range of human hearing and uses the microphone to capture the reflections off nearby objects—in this case, the user’s finger. Right—it’s sonar.
So far it’s a bit crude but promising for those looking to break into your device. The researchers said they were able to reduce the number of possible unlock patterns by 70% using data obtained with SonarSnoop on a Samsung Galaxy S4 smartphone running Android 5.0.1.
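The sonar idea itself is simple: emit a known chirp from the speaker, record with the microphone, and cross-correlate the recording with the chirp. The correlation peak gives the echo’s round-trip delay, and delay times the speed of sound gives distance to the reflector. A toy sketch of that ranging step (not the researchers’ code; the sample rate and frequencies are illustrative):

```python
import numpy as np

FS = 48_000        # sample rate, Hz
DURATION = 0.01    # 10 ms probe signal
N = int(FS * DURATION)

def make_chirp(f0=18_000.0, f1=22_000.0):
    """A near-ultrasonic linear sweep; swept tones give a sharp correlation peak."""
    t = np.arange(N) / FS
    phase = 2 * np.pi * (f0 * t + (f1 - f0) / (2 * DURATION) * t**2)
    return np.sin(phase)

def echo_delay_samples(recording, chirp):
    """Cross-correlate the mic recording with the emitted chirp.

    The peak of the correlation marks where the strongest echo begins.
    """
    corr = np.correlate(recording, chirp, mode="valid")
    return int(np.argmax(np.abs(corr)))

def delay_to_distance_m(delay_samples, speed_of_sound=343.0):
    """Convert a round-trip delay (in samples) to one-way distance in meters."""
    return delay_samples / FS * speed_of_sound / 2
```

SonarSnoop’s contribution is not this ranging math but the tracking on top of it: repeating the measurement fast enough, from both speakers, to follow a moving finger and prune the space of candidate unlock patterns.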
This is one more in a depressing list of possible exploits of smartphone sensors. Researchers have demonstrated in the past—using gyroscopes, accelerometers, barometers, and magnetometers—that it is possible to track smartphone users even when they have their location services turned off.
In this case, the researchers credit FingerIO for their inspiration. That concept was demonstrated in March 2016 by researchers who used high-frequency sound waves to pick up finger motions and duplicate them on a smartwatch touchscreen. As they put it, in typical academic language, “Our work highlights a new family of security threats.” Not the kind of neighbors we need.
Page 3: Back to the backdoor. The battle between tech companies and law enforcement over encryption has been going on for some time now. Law enforcement officials say they support it, and have respect for personal privacy—except when it comes to criminals and terrorists. They say they can’t do their jobs and keep us all safe if they are blinded by encryption. Two FBI directors—former Director James Comey and the incumbent, Christopher Wray—have both warned that unbreakable encryption is causing efforts to preserve public safety to “go dark.”
And the response from encryption experts has been the same all along: You can’t make a backdoor just for the good guys. The bad guys will get it too. Encryption guru Bruce Schneier is just one of many who has said multiple times that “it is absurd to think that encryption can work well unless there is a certain piece of paper (a warrant) sitting nearby, in which case it should not work.”
“Mathematically, of course, this is ridiculous,” he said. “You don’t get an option where the FBI can break encryption but organized crime can’t. It’s not available technologically.”
But now an alliance of intelligence agencies in five countries—the U.S., the U.K., Australia, Canada, and New Zealand—known as the Five Eyes, appears to be ready to test that proposition. After a meeting in August, the group published a Statement of Principles on Access to Evidence and Encryption, which makes reference to the “voluntary cooperation of industry partners.”
It claims to support encryption, declaring that it is “vital to the digital economy and a secure cyberspace, and to the protection of personal, commercial and government information.” It claims that “the five countries have no interest or intention to weaken encryption mechanisms.” Except they do.
Other parts of the document make it clear the “industry partners” won’t have any choice about “volunteering.” In the closing lines of the statement, the agencies say that if they “encounter impediments” to the information they want, “we may pursue technological, enforcement, legislative or other measures to achieve lawful access solutions.”
Which prompted a blistering post from journalist, blogger, and privacy advocate Cory Doctorow, who said the agencies pushing for this don’t understand encryption if they think it can function the way they want. “Use deliberately compromised cryptography, that has a back door that only the ‘good guys’ are supposed to have the keys to, and you have effectively no security,” he wrote.
But given that governments have much more power than activists, we may be about to find out.
And that’s it for this week. The Weekly Security Mashup is a group effort, so thanks again to our entire content team, and thanks to you for watching. Help us spread the word. Tweet it, link it, share it, like it, and come back again next week. I’m Taylor Armerding for the Synopsys Software Integrity Group, where we help organizations build secure, high-quality software faster.