REVOLTE ATTACK

New attack allows hackers to decrypt VoLTE encryption to spy on phone calls

A team of academic researchers - who made headlines earlier this year for disclosing severe security problems in 4G LTE and 5G networks - today unveiled a new attack called "ReVoLTE" that lets attackers break the encryption of VoLTE voice calls and spy on targeted calls.

The attack does not exploit a flaw in the Voice over LTE (VoLTE) protocol itself; instead, it takes advantage of the way most telecom providers implement LTE in practice, allowing an attacker to eavesdrop on the encrypted phone calls of targeted victims.

VoLTE, or Voice over Long Term Evolution, is a standard for high-speed wireless communication for mobile phones and data terminals, including Internet of Things (IoT) devices and wearables, that use 4G LTE wireless technology.

The crux of the matter is that mobile operators often use the same keystream to encrypt the voice data of two consecutive calls made within one radio connection between the phone and the same base station, i.e. the cell tower.

The new ReVoLTE attack exploits this keystream reuse by vulnerable base stations, allowing attackers to decrypt the contents of VoLTE-encrypted voice calls in the scenario described below.

However, keystream reuse is not new in itself; it was first pointed out by Raza & Lu, but the ReVoLTE attack turns it into a practical attack.

How does the ReVoLTE attack work?

In the first phase of the ReVoLTE attack, the attacker must be connected to the same base station as the victim and place a downlink sniffer to monitor and record a "target call" that the victim makes to another person; this is the call that will later be decrypted.

Once the victim hangs up the target call, the attacker has to call the victim, typically within about 10 seconds, which forces the vulnerable network to set up the new call between victim and attacker over the same radio connection that was used for the previous target call.

"Keystream reuse occurs when the destination and the keystream call use the same user-plane encryption key. Since this key is updated for each new radio connection, the attacker must ensure that the first packet of the keystream call arrives within the active phase after the destination call," the researchers said.

Once the connection is established, the second phase requires the attacker to engage the victim in conversation and record it in plaintext; this known plaintext later lets the attacker compute the keystream used for this second call.

According to the researchers, XORing that keystream with the corresponding encrypted frames of the target call recorded in the first phase decrypts their content, allowing the attackers to eavesdrop on the conversation the victim had in the previous call.

"Since this results in the same keystream, all RTP data is encrypted in the same way as the destination call voice data. Once a sufficient amount of keystream data has been generated, the attacker aborts the call," the paper says.

However, the second call must last at least as long as the first one in order to decrypt every frame; otherwise only part of the call can be decrypted.

"It is important to note that the attacker must engage the victim in a prolonged conversation. The longer he/she has talked to the victim, the more content of the previous communication he/she can decode," the paper says.

"Each frame is associated with a count and encrypted with an individual keystream, which we extract during keystream computation. Since the same count generates the same keystream, the count synchronizes the keystreams with encrypted frames of the destination call. By XORing the keystreams with the corresponding encrypted frame, the destination call is decrypted."

"Since our goal is to decrypt the entire call, the keystream call must be as long as the target call to deliver a sufficient number of packets, otherwise we can only decrypt a portion of the call."

ReVoLTE attack detection and demonstration

To demonstrate the practical feasibility of the ReVoLTE attack, the team of scientists from Ruhr-Universität Bochum implemented an end-to-end version of the attack against a vulnerable commercial network, using commercial phones.

The team used Software Radio Systems' Airscope downlink sniffer to capture the encrypted traffic and three Android-based phones to retrieve the known plaintext on the attacker's phone. It then compared the two recorded conversations, derived the keystream, and finally decrypted part of the previous conversation.

You can watch the demo video of the ReVoLTE attack; the researchers say the full setup needed to capture and eventually decrypt the downlink traffic costs less than $7,000.

To determine the extent of the problem, the team tested a number of randomly selected radio cells across Germany and found that 12 out of 15 base stations were affected; the researchers say the vulnerability affects other countries as well.

The researchers notified the affected German base station operators about the ReVoLTE attack in early December 2019 through the GSMA Coordinated Vulnerability Disclosure Program, and the operators had deployed patches by the time of publication.


Since the problem also affects a large number of providers worldwide, the researchers have released an open-source Android app called "Mobile Sentinel" that users can run to determine whether their 4G network and base stations are vulnerable to the ReVoLTE attack.

The researchers - David Rupprecht, Katharina Kohls and Thorsten Holz from Ruhr University Bochum and Christina Pöpper from NYU Abu Dhabi - have also published a dedicated website and a research paper (PDF) titled "Call Me Maybe: Eavesdropping Encrypted LTE Calls With REVOLTE" with further details of the ReVoLTE attack.

Source

11 03 2021

How confidential are your calls?
This iPhone app shares them with everyone

No need to panic.

This is not a case of secret nation-state methods of phone tapping (or spying, as it is often called).

It's not a story of cybercriminals deliberately trying to listen in on your business conversations to divert massive bill payments or implant ransomware with multi-million dollar extortion demands.

That was the good news.

The flaw in this case, discovered by Indian cybersecurity researcher Anand Prakash, was simply a bug caused by poor programming.

The bad news is that the side effects of the bug could be exploited by anyone, anywhere, anytime.

Who needs authentication?

The type of vulnerability Prakash found often goes by the fancy-sounding name IDOR, short for Insecure Direct Object Reference.

An IDOR vulnerability usually arises when a website or service that makes it easy for someone running the app to retrieve the data they are supposed to access...

...also makes it easy to figure out how to access other people's data, without logging in or even authenticating.

Typically, you'll find that an app or service uses a URL or web form containing your own user ID, serial number, or some other not-very-secret identifier, with no other way of making sure it's really you. You might then try creating a request using someone else's ID, the next number in the sequence, or some other likely guess at a valid reference, and find that the system happily retrieves the data for you, even though it's not your record and you shouldn't be able to see it.
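
As a purely hypothetical illustration of that pattern (the endpoint and IDs below are invented and do not refer to any real service), an IDOR probe can be as simple as an unauthenticated request with somebody else's identifier substituted in:

# Hypothetical IDOR probe; the endpoint and IDs are made up for illustration.
import requests

BASE_URL = "https://api.example.com/records"   # invented endpoint

def fetch_record(record_id: int) -> dict:
    # Note what is missing: no session cookie, no auth token, nothing that
    # ties the request to a logged-in user. The only "secret" is the number.
    resp = requests.get(f"{BASE_URL}/{record_id}", timeout=10)
    resp.raise_for_status()
    return resp.json()

# An attacker who knows their own ID (say 10234) simply tries its neighbours:
for guess in range(10230, 10240):
    try:
        print(guess, fetch_record(guess))
    except requests.HTTPError:
        pass  # that ID didn't exist or wasn't served; move on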

In theory, many exploitable IDOR flaws can be found purely analytically, by reverse-engineering the suspect app without ever creating a fake account and running the app itself. In practice, it's often easier and quicker to do some basic reversing to get an idea of what to look for, and then to run the suspect app while you watch it in action. You don't need to spend days statically analyzing an app in a decompiler when you can infer the bugs directly from its own behavior - you just give the app a chance to cook its own cybersecurity goose while you take notes.

In this case, the app is an iPhone app called "Acr call recorder", and like many apps in the App Store it is (or was when we looked) littered with hundreds, even thousands, of glowing 5-star reviews.

You can probably guess where this is going: many of these 5-star reviews curiously recommend a completely different app in their text, or praise the app in strange turns of phrase that put forward unlikely, even disturbing, reasons.

For example, someone named Earnest assures you that "it's definitely a waste if you haven't tried this app," while Christopher.1966 says he's been "using this little thing almost since I got on the train," and Brenda somewhat creepily, if redundantly, expresses her delight that she can now "record what me and my girlfriend said." (A call recorder that couldn't record calls would be plainly misnamed.)

Even though it turns out Brenda is referring to an entirely different app that includes a voice-switching feature, one wonders if Brenda's friend realized who she was talking to when she was recorded. Brenda's 5-star rating still counts towards the attractive average rating of 4.2/5 for the aforementioned call recorder app.

However, there are many 1-star reviews that warn you that this is one of those "free trial apps" that automatically charge you if you don't cancel within three days - a type of free app that Elizafish very succinctly described with her review "FREE ????? Ridiculous."

But perhaps the most apt review, at least until the app was updated after the developer received Anand Prakash's bug report, was Leanne's 5-star review, which said, "Not only can I manage recordings, but I can easily share them when needed. So convenient for me!"

What Leanne left out, however, was that the app's cloud-based storage feature is handy not just for her, but for everyone else in the world, including those who don't own the app or an iPhone.

Sharing her calls with other people was apparently a lot easier than she thought.

Who needs authentication?

Prakash decompiled the app to look for likely URLs it might connect to, monitored the app as it ran, and found that one of its call-home requests contained a block of JSON data that looked something like this:

POST /fetch-sinch-recordings.php HTTP/1.1
Host: [REDACTED]
Content-Type: application/json
Connection: close
[. . .more headers, no unique cookies or tokens. . .]

{
"UserID": "xxxxxx",
"AppID": "xxx"
}

Since there is no way to tie this request to a specific user who has already authenticated, and the server has no way of deciding whether the sender of the request even has the right to ask for data belonging to the user designated by UserID, anyone can insert any UserID into a fake request, and no one's data is safe from anyone else.

This type of flaw is called an IDOR because it allows attackers to insecurely and directly designate their victims, simply by inserting a new UserID directly into the request.
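
For contrast, here is a minimal sketch of the check that was missing, written in Python with Flask purely for illustration (the real backend is a PHP endpoint whose code is not public, and the helper functions here are stubs): the caller's identity comes from authentication, not from an ID field in the request body.

# Illustrative only: a Flask sketch of the missing server-side check; the
# authenticate() and load_recordings() helpers are stubs, not real code.
from dataclasses import dataclass
from typing import Optional

from flask import Flask, abort, jsonify, request

app = Flask(__name__)

@dataclass
class User:
    id: str

def authenticate(req) -> Optional[User]:
    """Stub: a real service would verify a session cookie or signed token
    and return the logged-in user, or None."""
    token = req.headers.get("Authorization")
    return User(id="xxxxxx") if token == "valid-demo-token" else None

def load_recordings(user_id: str) -> list:
    """Stub: a real service would load this user's recordings from storage."""
    return []

@app.post("/fetch-recordings")
def fetch_recordings():
    # Identity comes from authentication, never from the request body.
    user = authenticate(request)
    if user is None:
        abort(401)                     # not logged in at all

    requested_id = (request.get_json(force=True, silent=True) or {}).get("UserID")
    if requested_id != user.id:        # authorization: your own data only
        abort(403)

    return jsonify(load_recordings(requested_id))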

What to do?

Here's our tip.

As a user, don't be swayed by reviews on the App Store or Google Play.

We suggest that you ignore the reviews and star ratings in the app stores. You have no idea who gave those ratings or left the reviews, or if they even used the app.

Fake reviews and official-looking app store ratings can be bought online for a price that makes them almost literally ten a penny. Look instead for reviews on independent user forums or for discussions in online cybersecurity groups.

Consider using third-party mobile cybersecurity software to supplement the built-in protection of the device you're using.

Both Apple and Google run their own online stores that contain vetted and approved apps. However, these walled gardens are far from perfect. There are simply too many developers and too many apps for them to be thoroughly vetted by an expert.

As a programmer, follow the operating system's own recommendations for secure coding.

Following the guidelines from Apple, Google and others on secure programming for their platforms is not, on its own, enough.

However, if there is vendor advice that you have ignored, or are not even aware of, then your cybersecurity is probably not up to scratch. So treat the vendor's own guidelines as "necessary but not sufficient".

Apple, for example, publishes a wide range of security advice for programmers, covering authentication (is the right person doing the right thing?), confidentiality (is the data safe from snooping when stored or moved across the network?) and validity (does the right code do the right thing?).

Never stop learning and reading about cybersecurity.

One of the reasons we exist is to help you understand and fight cybercrime, and avoid the kind of mistakes that make life easier for crooks.
(We don't see this as a one-way street - we read all the comments and advice you, our readers, leave here, and make sure our developers and product managers hear you too!)

If you're just getting into cloud or web development and want to know what you should learn first, the OWASP Top Ten are probably a good place to start.

Remember that cybersecurity is a journey, not a destination.

12 03 2021

US government tracks how people move in coronavirus pandemic

A man checks his phone in Times Square. Cell phone data used by the government shows which public spaces still draw crowds.

WASHINGTON - Government officials in the U.S. are using location data from millions of cell phones to better understand the movements of Americans during the coronavirus pandemic and the potential impact of those movements on the spread of the disease.

The federal government, through the Centers for Disease Control and Prevention and state and local governments, has begun obtaining analyses of the presence and movement of people in certain areas of geographic interest from cellphone data, according to people familiar with the matter. The data comes from the mobile advertising industry rather than wireless carriers.

The goal is to create a portal for federal, state and local officials that will include geolocation data from potentially as many as 500 cities across the U.S., one of the people said, to help plan the response to the epidemic.

Using the data, which contains no identifying information such as the name of a phone's owner, officials can learn how the coronavirus is spreading across the country and try to mitigate its progress. It shows which retail stores, parks and other public areas still draw crowds that could accelerate transmission of the virus. In one such case, researchers noted that New Yorkers were gathering in large numbers in Brooklyn's Prospect Park and passed that information on to local authorities, one person said. Warning signs have been posted in New York City parks, but the parks have not yet been closed.

The data can also show the overall level of compliance with orders to stay home or stay safe, according to experts inside and outside the government, and help measure the economic impact of the pandemic by showing the decline in customers in stores, the decline in vehicle miles traveled and other economic metrics.

The CDC has begun to receive analysis based on location data through an ad hoc coalition of technology companies and data providers, all working with the White House and other government officials.

The CDC and White House did not respond to requests for comment.

The growing reliance on location data from mobile phones continues to raise privacy concerns, especially when programs are operated or commissioned by governments.

Wolfie Christl, a privacy activist and researcher, said the location data industry "covidwashes" products that typically violate privacy.

"In light of the looming disaster, it may make sense in some cases to use aggregate analytics based on consumer data, even if the data is collected surreptitiously or illegally by companies," Christl said. "Because true anonymization of location data is nearly impossible, strong legal protections are imperative." The protections should limit how the data can be used and ensure that it is not later used for other purposes, he said.


Privacy advocates fear that even anonymized data could be combined with other publicly available information to identify and track individuals.

Some companies in the U.S. location data industry have made their data or analytics available to the public, or made their raw data available to researchers or governments. San Francisco-based LotaData launched a public portal to analyze movement patterns within Italy that could help authorities plan for outbreaks, and plans additional portals for Spain, California, and New York. The company Unacast launched a public "social distancing scoreboard" that uses location data to rate places on how well their populations follow instructions to be at home.

Other state and local governments have also begun commissioning their own studies and analyses from private companies. Foursquare Labs Inc., one of the largest providers of location data, said it is in talks with numerous state and local governments about the use of its data.

Researchers and governments around the world have used a patchwork of authorities and tactics to collect cell phone data - sometimes hoping for voluntary consent from companies or individuals, in other cases using laws meant for terrorism or other emergencies to collect vast amounts of data on citizens to combat the threat of the coronavirus.

Researchers at the Massachusetts Institute of Technology have started a project to track Covid-19 volunteers via a mobile phone app. Telecommunications companies in Germany, Austria, Spain, Belgium, the United Kingdom and other countries have passed data on to authorities to help fight the pandemic. Israel's intelligence agencies have been tapped to use anti-terror phone-tracking technology to map infections.

In the U.S., most of the data used to date has come from the advertising industry. The mobile marketing industry holds billions of geographic data points on hundreds of millions of U.S. mobile devices - drawn mainly from applications that users have installed on their phones and allowed to track their location. Huge amounts of this advertising data are for sale.

The industry is largely unregulated under existing privacy laws because consumers have consented to tracking and because the data does not include names or addresses - each consumer is represented by an alphanumeric string.

Wireless carriers also have access to massive amounts of geolocation data, which under U.S. law enjoys much stricter privacy protections than in most other countries. The largest U.S. carriers, including AT&T Inc. and Verizon Communications Inc., say they have not been approached by the government to provide location data, according to spokespeople. There have been discussions about trying to obtain U.S. telecommunications data for this purpose, but the legality of such a move is not clear.

Source