Leak from an unexpected direction: Fitness network "Strava"

02/02/2018
G DATA Blog

For many people, fitness trackers are a part of their everyday life. They are supposed to help us to work out more and to live healthier. When such devices can be abused to map out military installations, though, they are a big problem.

Our little fitness helpers generate a plethora of data: how much you have worked out, how many calories you burned and which route you cycled or where you did your morning run. All this data can also be uploaded to an online platform to document and share your own progress. Members of the military also want to maintain and improve their fitness and their workout routine, especially if they happen to be deployed at a remote outpost far away from anything. Running around the compound as part of a workout routine therefore seems a logical thing to do.

Data reconciliation

Back in November 2017, the "Strava" platform published a heatmap generated from the data of its users. It was intended to visualize how much the app is used and where users go running or cycling. Streets are clearly visible on the map, and activity is strongest in densely populated areas, which was to be expected. However, if you move to more sparsely populated areas in some problematic regions of the world, you will spot isolated bright patches every now and then. Those bright spots in the middle of nowhere are the problem: they are often military bases, where Strava users generate data.

Locations and movements

The maps are remarkably precise. In some of them you can clearly make out the layout of a base and even distinguish individual routes taken by users. An analyst stated that this data is "not amazing for operational security", which is clearly an understatement. Any armed force is usually very tight-lipped when it comes to force levels, movements, locations and layouts of military installations. The fact that you can even make out less frequented areas within an installation is therefore quite problematic, and military commanders are very likely to be less than thrilled about the findings.

Requesting permission

Of course, many apps can be configured not to share specific data with a cloud platform. That is, if the settings are configured in the first place. This has not happened in a number of cases. The consequence of this slip-up will probably be a ban on fitness trackers in certain areas.

These considerations do not only make sense for members of the military: businesses as well as individuals might have little interest in sharing information about their movements and whereabouts. Anyone using a fitness tracker or an app which serves the same function should carefully consider if and which data to share. This is especially true if the data is saved and processed on a web-based platform. In this case, potentially confidential data was made public without any malicious intent. This possibility must be taken into consideration.

Privacy and data leaks

Should a platform be compromised by a malicious actor, then there is potentially a lot more at stake than just one's own movements. The discussion around fitness trackers and their susceptibility to attacks comes up regularly. Some interesting questions have also been raised already about privacy: some health insurances offer discounted rates to customers on the condition that they share the data of their fitness trackers with the insurance company.

Devices like fitness trackers, smart watches and others need to be seen for what they are: sources of data, the activity of which is not always evident. The devices and the apps they connect to will keep collecting data even if either of the two is not being used actively at a given moment and is lying on a table or in a bag. Every user should therefore check carefully which data is collected, especially when this data may include location data. Depending on the operating system and the configuration of certain services, movements can sometimes be traced back years into the past. This thought will surely make a lot of people feel quite uncomfortable.

What to do

Apps and devices with an appetite for data are becoming increasingly problematic, as many data points can be abused. Many apps have a poor track record in terms of their security. This includes, but is not limited to, health and fitness apps as well as apps used for online banking. Even if certain individual pieces of information are not critical, they can paint a very clear picture when put into context. For a sufficiently motivated attacker it is quite possible to collect such data. Not taking care of this issue makes it easier for them than it needs to be.
Review your device and app settings and ask yourself the following questions:

  • Which data is collected?
  • Is the data vital for the app's / the device's function?
  • What do you use the device for?
  • Is there any company data stored or processed on a given device? This includes emails as well as documents. Businesses should have a very clear set of rules for the use of privately owned devices for business purposes.
  • Do you enter any restricted or secured areas with your device?

In most current mobile operating systems you can deny an individual app access to specific data such as location data, camera roll or contacts. Evaluate whether an app should be able to access the data at all, based on the questions above. For business-only devices, the rule should be "trust, but verify", so confidential data has a better chance of remaining confidential. Administrators should consider only allowing a set of sanctioned apps and using a Mobile Device Management solution to secure and administer the devices.
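On Android, for example, administrators can review and revoke per-app permissions from the command line via adb, which can be handy when auditing several devices at once. A rough sketch of such a check; the package name is a hypothetical placeholder, not a real app:

```shell
# Find the exact package name of the app in question
# (lists third-party packages installed on the connected device)
adb shell pm list packages -3

# Inspect which permissions the app has requested and been granted
# (com.example.fitnessapp is a hypothetical package name)
adb shell dumpsys package com.example.fitnessapp | grep permission

# Revoke fine-grained location access for that app
adb shell pm revoke com.example.fitnessapp android.permission.ACCESS_FINE_LOCATION
```

Note that `pm revoke` only applies to runtime ("dangerous") permissions on Android 6.0 and later; on managed devices, the same restrictions are usually enforced centrally through the MDM policy instead.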