‘Informed’ consent in popular location-based services and digital sovereignty

ABSTRACT In many countries, informed consent is required before a service provider can collect personal data from a user. For location-based services (LBS), this applies in particular to personal location information, which can enable deep inferences about a person. In this paper, we present a systematic analysis of how informed consent for the collection of personal location information is obtained in 40 popular LBS on each of the two largest app stores. Two independent raters assessed the content, structure and design of the dialogues shown by apps to obtain consent from users. Based on their assessment, we identified common approaches used across and within different app categories and platforms, including the frequent use of ‘dark patterns’. We highlight key issues arising from these common designs, discuss specific gaps in the procedure of obtaining informed consent and propose improvements to that procedure. In addition, we consider current practice in the context of enabling digital sovereignty with respect to personal location information. Our findings can shape the design and evaluation of informed consent procedures for future LBS in research and practice.


Introduction
A key requirement for the provision of location-based services (LBS) is access to location data, in particular to the current location of an LBS user. This is needed to adapt the service to a specific location, for example, to provide a list of nearby restaurants. In addition, personal location information (PLI) such as the user's current location is an economic asset that many supposedly 'free' LBS take as hidden payment (Poikela and Toch 2017). The value (and danger) of PLI results from its potential for deep and fast inferences about users, their relationships and their behaviour. Since a location has an inherent meaning, it is relatively easy to infer, for example, that a user is probably receiving medical treatment from the fact that they spend time at a coordinate where a hospital is located. Similarly, it is possible to find out more about religious orientation or nightlife activities (Drakonakis et al. 2019). While service providers thus have strong incentives to collect location information, users might want to carefully control with whom they share their location information, as well as when and where they do so.
Due to the sensitive nature of PLI, laws in many countries now require explicit and informed consent from users before their location data can be collected. Arguably, truly informed consent is a key ingredient of (digital) sovereignty as well: if a person does not understand what they are agreeing to, they are unable to exert their own will or assess what is in their best interest. In addition, power over individuals is granted to other parties without the individuals being fully aware of this and without them being able to take control of the situation. In Europe, the General Data Protection Regulation (GDPR) applies to all personal data including PLI. Informed consent is one core element of this regulation, among others such as the right to access data that is collected about oneself or the right to have one's data deleted.
In practical terms, this has a number of noticeable effects. For example, websites now have to explicitly ask before they can track users, e.g. via cookies or other technologies. This usually means that upon first visiting a site, users are presented with mandatory, non-standardised and sometimes very complex dialogues before they can access the content (Utz et al. 2019). In location-based services, the situation is somewhat different: while similar dialogues exist there as well, the underlying mobile operating systems additionally require explicit consent in a standardised way before apps can collect location data. In contrast to typical websites, LBS usually cannot provide their services unless they collect location information.
Regarding user interface design for informed consent, LBS employ different approaches to obtain permission from users to access location information. While the user interface for location access is standardised on the two major mobile operating systems, some apps place custom and sometimes intrusive dialogues before the actual system dialogue (see Figure 1). Certain user interface (UI) designs and user experience (UX) patterns can mislead users into performing actions that are not in line with their preferences.
In light of this strong dependency on personal location information, this paper investigates the current practice of popular location-based services in dealing with informed consent as it pertains to PLI. Besides exploring the different ways in which informed consent is sought, we were particularly interested in how these practices affect the digital sovereignty of LBS users. In order to shed light on these issues, we carried out a systematic analysis of 40 popular location-based services available on the Google Play Store as well as on Apple's App Store. Two independent raters assessed each app regarding the content, structure and design of its informed consent dialogues. In addition to the analysis framework, our key contributions are the following insights: we identified a number of common patterns and approaches and then analysed them in terms of their impact on digital sovereignty. We found substantial differences between app categories, and minor ones between operating systems, regarding the structure, content and design of the informed consent dialogue. The ease of access to control mechanisms also varied considerably across apps. In addition, dark patterns, i.e. generally deceptive UI/UX designs (see section 2.2), occurred in a significant number of cases. Therefore, enabling a high degree of digital sovereignty in this context will require changes in several areas. Data and analysis procedures are referenced or included in the appendix to facilitate reuse as well as inspection and reproduction of our results (see Table A2). A repository with our raw data is referenced in the data availability statement at the end of this article. The remainder of the paper first briefly reviews related work and then explains in detail the methodology that we developed to assess location-based services with respect to informed consent. Subsequently, we highlight key results and discuss them as well as their implications for digital sovereignty.
The paper concludes with a short summary of the main findings.

Related work
In this section, we highlight related work on informed consent, dark patterns, location-based services and location privacy, as well as studies that analyse mobile LBS from a privacy perspective. We also discuss and define digital sovereignty.

Informed consent
Since informed consent to the access of personal information is required by regulations like the GDPR (Tsohou and Kosta 2017), the topic has naturally also received attention in the context of mobile apps and LBS research. Bu-Pasha et al. (2016) pointed out that, in the EU, informed consent to data collection was required even before the GDPR came into effect. This includes information about the purpose of the data usage and about with whom this data is shared (Bu-Pasha et al. 2016). Using location data as an example and building on the GDPR, the authors claimed that users should be able to withdraw consent in a convenient way and to delete past location data. They further demanded 'specific legal standards for the location data privacy in smartphones' (Bu-Pasha et al. 2016). Tsohou and Kosta (2017) provide a legal perspective and study the user perspective on informed consent for location tracking in mobile apps. Based on semi-structured interviews, in which they observed 15 participants installing mobile apps that access their users' location, with and without guidance for reading the privacy policy, they created process models for giving informed consent. These structured process models cover the privacy policy reading process, the privacy awareness process and the informed consent process. Their study revealed that even if users understand a privacy policy, they are often unable to translate it into specific risks for their privacy. Furthermore, users are often not aware of the privacy terms when they accept them, and mobile apps do not use appropriate language in their privacy policies (Tsohou and Kosta 2017). Another recent phenomenon in the digital world has also received attention: cookie banners. Utz et al. (2019) conducted a large-scale online field study to investigate the influence of the position of consent notices, the kind of choice that is presented and how the content is framed.
Apart from the fact that users tend to give more consent if they only have a binary choice (in comparison to explicit consent to multiple features or companies), nudging is often applied in the consent process, and the authors conclude that, in reality, customers do not have a meaningful choice (Utz et al. 2019).
While some user interfaces for informed consent, like cookie banners (Utz et al. 2019), have been researched in great detail, there is a lack of research on mobile user interfaces for informed consent in the context of LBS. Although Tsohou and Kosta (2017) looked at informed consent as part of the installation process of an app (including the reading of privacy policies), they did not explicitly examine the actual user interfaces involved in requesting this consent, such as the specific dialogues that were used. We argue that analysing the UI and UX used to obtain consent for accessing PLI could provide additional insights and reveal further gaps in the design of LBS.

Dark patterns
The topic of 'dark' UI patterns has emerged over the last years. Drawing on observations from practice, Harry Brignull coined the term 'dark pattern' in 2010. On his website 'www.darkpatterns.org' he defined dark patterns as 'tricks used in websites and apps that make you do things that you didn't mean to, like buying or signing up for something' and gave a number of examples from different categories found in practice. This concept was then picked up by the research community. For example, Gray et al. (2018) took Brignull's original categories and defined five categories based on a corpus analysis of cases of dark patterns found in practice on the internet:
• Nagging: 'Redirection of expected functionality that persists beyond one or more interactions.'
• Obstruction: 'Making a process more difficult than it needs to be, with the intent of dissuading certain action(s).'
• Sneaking: 'Attempting to hide, disguise, or delay the divulging of information that is relevant to the user.'
• Interface interference: 'Manipulation of the user interface that privileges certain actions over others.'
• Forced action: 'Requiring the user to perform a certain action to access (or continue to access) certain functionality.'
Applying the original categories of Brignull and the refined version from Gray et al. (2018), the recent study by Di Geronimo et al. (2020) specifically looked at mobile apps and analysed them for occurrences of dark patterns. Motivated by a lack of research on dark patterns in mobile apps and their perception by users, the authors studied the prominence of dark patterns in mobile apps and users' awareness of them. Their analysis corpus consists of 240 mobile apps from the Android app store across eight categories (30 from each category), which were then analysed by two researchers for the presence of dark patterns. The results revealed that, on average, every application contains seven dark patterns.
To assess users' perception of those dark patterns, Di Geronimo et al. (2020) then conducted an online experiment in which they showed users short videos of app usage and had them fill out a questionnaire to assess whether they recognised dark patterns. This study showed that users often do not recognise dark patterns in mobile apps (Di Geronimo et al. 2020).
Similar to Di Geronimo et al. (2020), we use the same dark pattern categories that were created in the work of Gray et al. (2018) but go one step further: instead of searching for dark patterns in the general context of mobile apps, we look at informed consent for location access in location-based services.

Location-based services and location privacy
As already mentioned, location data can reveal sensitive additional information such as the user's home (Krumm 2007; Drakonakis et al. 2019) or work location, as well as information about their health status, religion or nightlife (Drakonakis et al. 2019). Hence, users' privacy regarding location information might be harmed. Bargiotti et al. (2016) defined location privacy as the 'individual's right not to be subjected to unauthorised collection, aggregation, processing, and distribution (including selling) of his location data. It is the right to be protected by the ability to conceal information of whereabouts.' Duckham and Kulik (2006) based their definition of location privacy on Alan Westin's definition of 'information privacy'. According to them, 'location privacy can be defined as a special type of information privacy that concerns the claim of individuals to determine for themselves when, how, and to what extent location information about them is communicated to others. In short, control of location information is the central issue in location privacy' (Duckham and Kulik 2006). While the latter definition does not include the processing of location data, it highlights the importance of control over location information, which is strongly related to informed consent, the central topic of our work. In recent years, location privacy has gained popularity in research, which is also reflected in a number of survey papers that give an overview of basic concepts. Görlach, Heinemann, and Terpstra (2005) named attack mechanisms and protection mechanisms based on technology or regulations as central aspects. Duckham and Kulik (2006) highlighted privacy-related characteristics of different location sensing technologies and named regulations, privacy policies, anonymity and obfuscation as ways of protecting location privacy.
Krumm (2009) identified three main strands of work: attack mechanisms (or 'computational threats'), defence mechanisms (or 'computational countermeasures') and work that focuses on 'people's attitudes'. These basic themes can also be found in the work of Chatzikokolakis et al. (2017), with a categorisation into (technical) threats, the user's perspective and different (technical) methods of protecting location privacy. Wernke et al. (2014) distinguished between algorithmic attacks on location privacy and approaches to protect against them, whereas Bettini (2018) named threats, regulations, user privacy preferences and technical privacy protection mechanisms. While these literature reviews give an overview of valuable contributions to the research field of location privacy, the main concepts identified in them also indicate a potential lack of more practical studies.

Privacy related mobile LBS analysis
Since mobile apps can potentially have access to very sensitive information (like location, health data, the address book, etc.), analysing them from a privacy standpoint has resulted in a number of studies. Some studies specifically looked at apps that access the user's location and at pure location-based services. Techniques such as automatic analysis of Android's manifest file (Liu, Gao, and Wang 2017), manual source code analysis (Alhamed et al. 2013) and logging of location access via custom software (Fawaz, Feng, and Shin 2015) were applied. Liu, Gao, and Wang (2017) downloaded the top 100 apps from each of Android's 28 app store categories and automatically filtered for those that access the user's location. They aimed to find out more about the risk of background location tracking, in contrast to apps accessing the user's location only while in the foreground. In a manual process, they installed 1,140 applications by hand on a device and checked if the location is accessed in the background. In addition, they measured the risk of tracking for the identification of points of interest (POIs) by analysing the sampling frequency. Their findings suggested that many popular applications access location in the background, and that the risk of identifying users' POIs is higher when their location is constantly tracked in the background, due to the higher sampling frequency (Liu, Gao, and Wang 2017). In the context of location access by mobile apps, Fawaz, Feng, and Shin (2015) manually analysed over 1,000 free apps from the Google Play Store and checked whether location was requested to deliver the app's core functionality or by an analytics and advertisement framework.
Based on a more detailed usage analysis of 400 apps and 70 analytics and advertisement libraries with over 100 participants, the authors showed that location access can pose a severe profiling threat and that Android's control mechanisms for location privacy were not sufficient at the time of the study (Fawaz, Feng, and Shin 2015). Alhamed et al. (2013) took a more general approach and analysed the privacy control methods of Android and iOS that are part of the location service API. They aimed to identify shortcomings related to location access and requests. One of the identified shortcomings was the lack of information about the purpose of accessing location data (Alhamed et al. 2013). Since the study was conducted in 2013, their results might be outdated by now.
In order to provide wider context for our work, we also briefly point out some studies in the field of privacy research that analysed mobile apps in general. A strong focus has been put on automatic code or manifest file analysis of Android applications (Lin et al. 2014; Watanabe et al. 2015; Hatamian et al. 2019; Zimmeck et al. 2019), sometimes explicitly involving users (Lin et al. 2014). As the main source of information about how one's data is used, privacy policies of mobile apps and related textual information on privacy have been analysed in automatic (Zimmeck et al. 2019; Watanabe et al. 2015) or manual (Hatamian et al. 2019) ways. Studies analysing mobile apps from a privacy perspective have thus mainly focused on programmatic aspects and privacy policies, whereas analysing larger numbers of mobile apps for their privacy-related behaviour on the UI/UX level (like Di Geronimo et al.'s (2020) work on dark patterns) has received less attention. Our research is therefore complementary to the existing studies and tries to fill this gap, at least for location-based services.

Digital sovereignty
In recent years, the concept of digital sovereignty has appeared in public discourse and research. Although widely used, there seems to be only a fragmented and blurred understanding of this notion (Hummel et al. 2021). The literature review by Couture and Toupin (2019) on the topic of sovereignty in the digital world resulted in five different definitions. One of them is 'personal digital sovereignty', in which control over one's personal data is a central concept. Similarly, Hummel et al. (2021) recently conducted meta-research on the notion of data (also: digital or cyber) sovereignty. Their findings suggest that the understanding of this emerging concept is very diverse and potentially fragmented. One of the central values they identified is 'control and power' (Hummel et al. 2021); among other topics, control over data seems to be very prominent. Both works showed that the term 'digital sovereignty' is widely used but its meaning is less clear. In the context of this work, we build upon the definition of Misterek (2017). He defined an extended concept of digital sovereignty at the societal level and implied that digital sovereignty can only be realised when society takes part in the organisation of digitalisation (Misterek 2017). We claim that this is only possible when users are informed properly at the moment of data collection and have suitable tools and control mechanisms to exert digital sovereignty over their data. Hence, we understand information and control as central aspects of digital sovereignty.
As we have seen, informed consent is an important aspect of (location) privacy. Although this aspect has been researched for cookie banners (Utz et al. 2019) and for consent as part of the installation procedure of mobile LBS (Tsohou and Kosta 2017), it is also important to look at general UI/UX features during the runtime of an LBS. Similarly, dark patterns have been researched for mobile apps in general (Di Geronimo et al. 2020) and a number of large-scale studies on privacy in commercial LBS have been conducted (Alhamed et al. 2013; Liu, Gao, and Wang 2017), but we lack a more detailed analysis of the UI and UX for informed consent in LBS. We argue that informed consent, as one means of control over one's own data, is also a crucial aspect of digital sovereignty. Hence, we conducted a study specifically focused on informed consent for location access in commercial LBS.

Methodology
The goal of this study was to understand how consent regarding location information is handled in apps that access their users' location. In this process, we were interested in how user consent is obtained, whether and in what way this consent can be considered 'informed' and what options users have in terms of managing their PLI. Given the prevalence of dark patterns in mobile apps and the growing concern over a lack of digital sovereignty, we wanted to develop a method that is easy to apply and that factors in the actual user interface and experience during the process of giving consent. To generate comparable and comprehensive insights as well as to create the possibility for other researchers to replicate the procedure in the future, a structured approach was used. In this section, we provide methodological details on how we analysed apps in terms of their handling of informed consent for accessing personal location information.

Analysis framework
To address the issue in an ordered, comprehensible and reproducible way, we created an analysis framework that enables evaluation of aspects relevant to the handling of PLI in apps. The analysis framework consists of the components shown in Table 1.

Procedure
The applied procedure was partly inspired by the analysis of mobile apps for dark patterns conducted by Di Geronimo et al. (2020). The analysis covered 80 app installations in an active process: two researchers independently analysed each app, inspecting the same 40 apps per platform (Android 11 and iOS 14.3). Every app was newly installed, i.e. it had never been opened on the test device prior to being analysed, and was then used for roughly 30 minutes, depending on the complexity of the app. Similar to Di Geronimo et al. (2020), the researchers performed common tasks in order to reach usual goals (e.g. the registration/login process, using the app for its intended primary purpose, visiting the settings page, etc.) and observed how location privacy is handled in those situations according to the analysis framework. This potentially misses mechanisms and effects that are hidden and only occur, for example, after a longer period of usage or after unlocking features (e.g. via in-app purchases). The analysis was conducted from January until March 2021. Due to a temporal gap of approx. two months between both analyses, some of the analysed apps were updated by their providers in the meantime. This is not an issue for Android devices, as old app versions are mostly retrievable, but on Apple's operating system it caused minor deviations in the analysed app versions. While no significant deviation in the handling of the apps was noticed, this could still have a minor impact on the results. As the analysis framework relies at least partly on subjective interpretations, the two raters informally discussed and thus aligned their schemes in order to synchronise the applied ratings. Besides analysing with the stated framework, the researchers collected some metadata, including the name, providing developer/company, exact version numbers, Android package files and several screenshots of key aspects of the tested apps, to be able to reference findings subsequently. All results of the examinations were recorded in a rating matrix in which each aspect of the analysis framework was rated and optionally provided with a small commentary to support the rating.

Table 1. The analysis framework. (*For more information on the analysis parameters, see Table A1 in the Appendix.)

How is location access requested by apps?
• Form of interaction: When requesting access to PLI, apps come up with various solutions that can be roughly divided into three forms: system dialogue, custom dialogue or full-screen view* (see Figure 2). What form of interaction is presented to the user?
• Amount of information: When requesting location access, the amount of information given as to why, when and how location is used and processed varies between apps. What amount of information is included when consent is requested? (none, low, medium or high*)
• Timing: While some apps request access to PLI upfront on first usage, others delay that request to the point where location data is actually needed. Is the request for consent presented upfront on first start of the app? (yes or no)
• Repetition: When an app's access to PLI is denied or withdrawn, the request is sometimes repeated in order to (re)gain said access. This repetition sometimes happens multiple times (possibly regularly, e.g. on every start of the app) until the user's consent is given. Is the request for location access repeated if initially declined? (yes or no)
• Intrusiveness: Some apps repeat the location access request in an intrusive way, meaning the user is blocked from using the app without answering the request. For example, a dialogue page is intrusive as it blocks the rest of the UI and action is needed before a user can proceed with their intent. Is the repeated request presented in an intrusive way? (yes or no)

How do the apps deal with the GDPR requirements on a UI level?
• Accessibility of information: The GDPR requires the disclosure of privacy information, which is found in various places within apps. This aspect focuses on how accessible the information is, e.g. highly visible on the top-level view of the settings or hidden in a deep navigation path of FAQ articles. How accessible is the legally required information? (none, low, medium or high*)
• Accessibility of features: Alongside information, the GDPR requires a set of features that are mostly implementations of the defined user rights. In particular, the rights to access, export and delete data are analysed in apps, as these are common built-in features. How accessible are the legally required features? (none, low, medium or high*)

What information about location data privacy is disclosed by apps?
• Amount: On top of the mandatory information required by law, apps sometimes disclose more detailed or comprehensible background information (e.g. other than just legal jargon) on how/why/when the user's PLI is processed. As these details are not mandatory, their amount varies between apps. What amount of additional location privacy information is given? (none, low, medium or high*)
• Accessibility: Alongside the amount of such details, their accessibility is relevant. This aspect focuses on how accessible the information is, e.g. highly visible on the top-level view of the settings or, on the contrary, within a deep navigation structure of FAQ articles. How accessible is the additional location privacy information? (none, low, medium or high*)

What location privacy related actions can users themselves take?
• Amount: On top of the mandatory features, apps sometimes offer users additional control options regarding their PLI. This aspect focuses on whether the user has a set of tools/options available to control their location data. What amount of additional location privacy features is given? (none, low, medium or high*)
• Accessibility: Alongside the amount of such features, their accessibility is relevant. This aspect focuses on how reachable the features are, e.g. control options could be highly accessible with an obvious navigation path (a top-level section 'privacy'). How accessible are the additional location privacy features? (none, low, medium or high*)

Are there dark patterns (Gray et al. 2018) regarding PLI? (present or not present)
• Nagging: 'Redirection of expected functionality that persists beyond one or more interactions.'
• Interface interference: 'Manipulation of the user interface that privileges certain actions over others, thereby confusing the user.'
• Forced action: 'Requiring the user to perform a certain action to access (or continue to access) certain functionality.'
• Sneaking: 'Attempting to hide, disguise, or delay the divulging of information that is relevant to the user.'
• Obstruction: 'Making a process more difficult than it needs to be, with the intent of dissuading certain action(s).'
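To make the recording format concrete, the rating matrix can be thought of as a nested mapping from each analysed app to its per-aspect ratings and optional supporting comments. The sketch below is a hypothetical illustration of that structure; the app name, aspect keys and helper function are our own for illustration and are not taken from the study's actual data.

```python
# Hypothetical shape of the rating matrix: one record per (app, platform) pair,
# mapping each aspect of the analysis framework to a rating plus an optional
# free-text comment supporting that rating.
rating_matrix = {
    ("ExampleMapsApp", "android"): {
        "form_of_interaction": {"rating": "system dialogue", "comment": None},
        "amount_of_information": {"rating": "low", "comment": "one generic sentence"},
        "request_upfront": {"rating": "yes", "comment": None},
        "dark_pattern_nagging": {"rating": "present", "comment": "re-asks on every start"},
    },
    ("ExampleMapsApp", "ios"): {
        "form_of_interaction": {"rating": "custom dialogue", "comment": "shown before system dialogue"},
        "amount_of_information": {"rating": "medium", "comment": None},
        "request_upfront": {"rating": "yes", "comment": None},
        "dark_pattern_nagging": {"rating": "not present", "comment": None},
    },
}

def ratings_for_aspect(matrix, aspect):
    """Collect the rating each (app, platform) received for a single aspect."""
    return {key: aspects[aspect]["rating"]
            for key, aspects in matrix.items() if aspect in aspects}
```

A structure like this makes it straightforward to compare the two raters' matrices aspect by aspect, which is the input needed for the inter-rater reliability calculation described in the next subsection.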

Inter-rater reliability
In order to identify the level of agreement between the two raters, the results were assessed using Cohen's kappa coefficient (Cohen 1960). This statistical technique calculates the inter-rater reliability between two persons who rate qualitative items using the same scheme, and it corrects for agreements that occur by chance. The original Cohen's kappa takes the disagreement between the raters into account but does not measure its degree: all disagreement is treated equally as total disagreement. Therefore, a weighted kappa is preferred in our case, where different levels of disagreement contribute differently to the inter-rater reliability value (Cohen 1968); larger deviations between the raters carry more weight than smaller ones. The weighted kappa coefficient of both app analyses using linear weights is 0.69. This strength of agreement is considered substantial, indicating solid inter-rater reliability (Regier et al. 2013).
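The linearly weighted kappa can be computed directly from the two rating vectors. The following pure-Python sketch illustrates the calculation; the function and the example data are ours for illustration and do not reproduce the study's actual rating matrix.

```python
def weighted_kappa(rater_a, rater_b, categories):
    """Cohen's kappa with linear weights for two raters on ordinal items.

    rater_a, rater_b: equal-length lists of ratings drawn from `categories`,
    which lists the ordinal levels in order, e.g. ["none", "low", "medium", "high"].
    """
    assert len(rater_a) == len(rater_b)
    k = len(categories)
    index = {c: i for i, c in enumerate(categories)}
    n = len(rater_a)

    # Observed joint distribution of ratings and its marginals.
    obs = [[0.0] * k for _ in range(k)]
    for a, b in zip(rater_a, rater_b):
        obs[index[a]][index[b]] += 1 / n
    row = [sum(obs[i]) for i in range(k)]
    col = [sum(obs[i][j] for i in range(k)) for j in range(k)]

    # Linear disagreement weights: 0 on the diagonal, |i - j| / (k - 1) off it.
    # Kappa = 1 - (observed weighted disagreement / chance-expected weighted disagreement).
    d_obs = sum(abs(i - j) / (k - 1) * obs[i][j] for i in range(k) for j in range(k))
    d_exp = sum(abs(i - j) / (k - 1) * row[i] * col[j] for i in range(k) for j in range(k))
    return 1 - d_obs / d_exp
```

With this weighting, adjacent ratings such as 'medium' vs. 'high' reduce the coefficient far less than opposite ratings such as 'none' vs. 'high', which matches the intuition behind using a weighted kappa on ordinal scales.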

App selection
To find out how informed consent regarding PLI is currently managed in practice, we examined widely used, popular apps across the app stores. The analysed apps were selected to roughly represent the current app market and, ultimately, to understand how location privacy is usually handled in apps nowadays. The researchers are based in Germany, which potentially influenced the selection of apps, as the respective app stores, their content and form of presentation might differ by region. Generally, all audited apps are free of charge and were used without any purchases being made, although some apps offer in-app purchases or subscriptions. The selection therefore consists of free apps that are popular amongst the German audience. Multiple factors were taken into account when deciding which apps are relevant for this analysis:
• placement in charts/top lists within the respective app store (e.g. general or category-wise lists)
• download numbers (at least over 1 million; this is only published on Android)
• awards given by the respective app store that honour successful apps (for example, 'App of the year' in Apple's ecosystem)
• usage of location data (as the handling of PLI is our interest)
These factors were considered equally on both operating systems, which enables popular apps to be identified across operating system boundaries and comparability to be achieved. While we looked at both platforms for consistent and complete results, potential differences between operating systems were not systematically analysed; any differences we noticed during the analysis are briefly noted later on. The app stores on both platforms (iOS: App Store, Android: Google Play Store) already comprise a large number of categories (e.g. more than 25 on the Apple App Store). To create a distinctive variety of currently used apps while keeping a focus on LBS, this multitude of store categories was coarsely broken down into eight categories.
These categories were used to guide the app selection while taking apps with various use cases into account; thus, five apps out of each of the eight categories were selected for the analysis. This also has the effect of merging differing categories from both platforms. Some categories consist of apps that make active use of location data and can be identified as, strictly speaking, 'location-based services' (LBS+); the categories Maps/Navigation and Activity Tracking exclusively contain LBS+. Other categories mostly contain apps that use location information as a non-essential feature (LBS-) and only very occasionally include LBS+ (see Table A2 in the Appendix).
The current app selection provides a good cross-section of the kinds of apps that are popular and broadly in use today, but obviously constitutes a temporary snapshot. Due to continuous developments and shifts in the app stores, app popularity and availability are subject to change over time.

Results
In the following paragraphs, we highlight the main findings of our analysis. We first report on some general observations and then highlight how information about PLI collection and use is presented. This is followed by key results on how users can control data collection and use, on how access to PLI is requested by an LBS, as well as on the dark patterns we found. Marked differences between operating systems as well as compliance with the GDPR are mentioned where appropriate. Please note that GDPR compliance was assessed informally rather than through a formal legal analysis.

General observations
As discussed in the previous section, we distinguish between location-based services that strongly depend on having access to location information (LBS+) and LBS where location information is beneficial but not necessary (LBS-). We generally observed that either type of LBS relies either on a custom dialogue or on the system dialogue required by the operating system. The former either occupied the full screen or covered only part of it. During the analysis, we took screenshots not only of location access dialogues but also of other parts of the apps (see Figure 2). For iOS, rules exist regarding when these dialogues have to appear.⁴ In addition, these official guidelines require developers to include a sentence on how and why data is collected. Android guidelines prescribe neither. Features related to location privacy either cover the functions prescribed by the GDPR or go beyond it by offering additional or more fine-grained controls.
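On iOS, the required explanatory sentence is supplied by the developer in the app's Info.plist; without it, the system location dialogue cannot be shown at all. A minimal sketch of such a configuration fragment (the key name is part of Apple's documented API; the wording of the description string is our own illustrative example):

```xml
<!-- Info.plist fragment: iOS shows this string inside the system
     location consent dialogue; the key is mandatory for location access. -->
<key>NSLocationWhenInUseUsageDescription</key>
<string>Your location is used to show restaurants near you.</string>
```

Android has no equivalent mandatory purpose string, which matches our observation that its guidelines prescribe neither the timing nor the content of such explanations.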

Accessing relevant information
An essential first step in deciding whether and how to share PLI is to access information that is relevant to that decision. This includes, for example, learning more about who is collecting the data, for what purpose and how it is being processed. Much of this fundamental information mandated by the GDPR is provided in the form of privacy policies and was readily accessible in the vast majority of LBS (Rater A: 96%/Rater B: 86%, see Figure 3); some apps hid this information in less accessible or complex navigation structures (4%/14%). Additional information beyond the legal requirements was present in 24%/12% of all analysed apps. This includes, for example, more in-depth details regarding the processing of PLI, but also more comprehensible explanations (in contrast to prescribed legal clauses) of information that is already contained in privacy policies. Amongst other places, such information could be found in FAQs or in annotations within a settings section of an app. There was a marked difference with respect to accessing additional information depending on how relevant location is for an app. For LBS+, this information was easily accessible in 76%/100% of the analysed cases, i.e. it was reachable with a few interactions and could be found within an obvious path of navigation (e.g. a top-level entry within the settings view). For LBS-, however, additional privacy information (if available) was often hidden inside several sub-menus, scattered across multiple views and/or generally hard to find (65%/13% of those apps).

Controlling location disclosure
A second key component of digital sovereignty regarding PLI is conveying to an LBS how one wishes to share such information, for what purpose and with whom. This includes, for example, specifying whether location information should be shared at all or requesting the removal of all collected PLI. Analogous to information access, many fundamental control features are defined as mandatory in the GDPR. In sharp contrast, however, access to those required control features was generally difficult (in 52%/65% of all apps, as seen in Figure 4): control features were not easily accessible, were hidden in sub-menus rather than prominently displayed, and were sometimes not available within the app at all. For example, some LBS presented such features within the usual settings structure of the app, while others required users to contact the service provider via email to obtain the PLI that was collected about them or to have all collected information deleted, making these features far less accessible to users.
Similar to what we observed in the case of information access, control features beyond those required by the GDPR were only present in a small number of apps (29%/15%, as seen in Figure 5a). For example, 'exclusion zones' represent a location privacy feature that goes beyond the legally required ones and was frequently observed throughout the analysis. These are optional, user-defined areas in which no PLI is recorded or, if recorded, in which this data remains private. Unlike access to additional information, additional location privacy features were more easily reachable and accessible than legally required privacy features in most apps that included them (92%/100%, as seen in Figure 5b). We again observed a marked disparity between different types of LBS: 82%/20% of LBS+ presented such control features very prominently within the apps, while this was only true for 20%/7% of LBS-.

Location access request
In almost all cases, users did not actively initiate the sharing of PLI; rather, the LBS prompted them to obtain access to it. Since users were then asked to make a decision, they should ideally have all information and means readily available to do so in an informed and sovereign way. However, when initially requesting location access, the vast majority of LBS (94%/96%, see Figure 7) provided no or very little information to the user regarding why access to PLI is needed and how the data is processed. This might be partially due to the fact that more than half of all apps relied exclusively on the (mandatory) system dialogue, which comes with little to no information. While iOS requires developers to include an explanatory sentence (see Figure 6), this is not true for Android. Though most LBS can function without access to location information, more than 42%/40% of all apps requested access when first started rather than when it was actually needed. This applied in particular to LBS+ (83%/80%). When access was denied, a third of the LBS continued to repeatedly request access, for example, whenever the app was started again. We classified more than 81%/65% of these repeated access requests as intrusive, i.e. they interfered with the user's main activity and required an explicit interaction with the request dialogue to dismiss it.
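The deferred alternative, requesting access only at the moment the location feature is actually used, is straightforward to implement with Apple's CoreLocation API. A simplified sketch (the `requestAccessIfNeeded` wrapper and its call site are our own illustrative names; the `authorizationStatus` instance property requires iOS 14 or later):

```swift
import CoreLocation

final class LocationConsent: NSObject, CLLocationManagerDelegate {
    private let manager = CLLocationManager()

    override init() {
        super.init()
        manager.delegate = self
    }

    // Called from the feature that actually needs location, not at app
    // launch: the system consent dialogue then appears in a meaningful
    // context, and only if the user has not decided yet.
    func requestAccessIfNeeded() {
        if manager.authorizationStatus == .notDetermined {
            manager.requestWhenInUseAuthorization()
        }
    }
}
```

Because the system dialogue is only shown for the `.notDetermined` state, this pattern also avoids the repeated, intrusive prompts described above: once a user has denied access, the call simply does nothing.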

Dark patterns
As explained in section 2, dark patterns are user interface designs that manipulate users' behaviour in specific ways and that interfere with the users' intentions, sometimes without them noticing and often to their disadvantage. Particularly in the context of disclosing personal (location) information, dark patterns pose a substantial threat to users' privacy and sovereignty. Unfortunately, our analysis identified a number of well-known dark patterns that occurred at different frequencies (see Figure 8).
The pattern 'obstruction' occurred very often (93%/94%). For example, while it was always very easy to grant a service access to PLI, revoking access was much more difficult: it usually required users to leave the LBS and navigate the settings of the underlying operating system to deactivate location sharing for a specific LBS. We also frequently observed 'nagging' (28%/19%) and 'forced action' (20%/16%), whereas 'interface interference' (4%/9%) and 'sneaking' (6%/1%) were only used occasionally in the surveyed LBS. 'Nagging' and 'forced action' occurred, for example, when apps persistently requested access to PLI upon every start or only allowed the completion of an action if a user agreed to grant access. 'Interface interference' was noticeable, for example, when a button to accept the usage of PLI was displayed much more prominently in a dialogue, while the option to reject was shown very inconspicuously (e.g. greyed out), thus nudging users to go with the first option. When considering intrusiveness in general, we found that LBS with a full-screen location access dialogue were more likely to be intrusive than apps with a system or custom dialogue. LBS+ were overall more likely to be intrusive and tended to rely on full-screen location access requests. LBS- tended to rely on the generic location access dialogue provided by the operating system and were thus less intrusive in general.

Discussion
The results reported above have direct implications for digital sovereignty, i.e. for whether users can access the information they need to decide which course of action benefits them and whether they are able to convey their decisions effectively to the service. While basic information in the form of privacy policies is readily available, access to basic, mandatory control features is not, meaning that it is not straightforward for users to convey their wishes to the LBS. In terms of additional information and features, the opposite is true: additional information is not easy to get to, but the additional location privacy features are easily accessible (although not very prevalent). In this case, it might be difficult for users to obtain all the information necessary to use the additional features effectively. This imbalance in the provision of information and access to control features is arguably detrimental to digital sovereignty, as it makes forming decisions and executing them more difficult.
While GDPR-related information was relatively easy to find in most of the apps (in the form of privacy policies), explanatory additional information about the usage and processing of PLI was either unavailable or limited to a few sentences. Most apps seem to provide only legally required information while keeping further information about location privacy to a minimum. Companies and app developers most likely focus on legal requirements and the app's core functionality without concentrating on further background information.
Proactive and situational requests by LBS to be granted access to PLI are not per se problematic for digital sovereignty. However, if such requests do not allow for access to relevant information (as we observed in most cases), are persistently obtrusive or prevent access to an app altogether, there are negative consequences for digital sovereignty, because users cannot access the information needed to make an informed decision when asked to do so. Finally, dark patterns by definition try to manipulate user behaviour to achieve outcomes that are often not in the best interest of the user but rather benefit the service provider. As such, they actively undermine or prevent digital sovereignty and also counteract the principles outlined in the GDPR.
While several dark patterns suggest malicious intent on the part of their creators, in practice they might simply be caused by bad design and could be unintentional (Gray et al. 2018). Both the iOS and Android operating systems provide a standardised approach to accessing PLI, which might not be suitable for all apps. Some only need the user's current location, while others rely on continuous tracking. Apps whose core functionality is based around PLI in particular have to be more intrusive to some degree, as they would not be able to function properly otherwise. Judgements about dark patterns must therefore be made in a differentiated way rather than uniformly across all apps. Similarly, Gray et al. (2018) highlight that dark patterns might just be anti-patterns if no malicious intent is present.
The majority of apps in our analysis were labelled with the 'obstruction' dark pattern. Most of the time, we observed that while it is easy to grant all apps access to PLI, revoking that access is not possible in the app itself. The apps, however, cannot be directly blamed for this potentially poor user experience, as revoking access to PLI is not a function that is available to app developers. Both iOS and Android users need to visit the operating system settings and revoke access to PLI for each app manually, which is considerably more complex than giving consent to location access by clicking a button in a dialogue that pops up on app start or while using an app. Nevertheless, it remains easier to grant access to PLI than to revoke it, and we see this imbalance as a practice that hinders digital sovereignty.
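While apps cannot revoke location access programmatically, developers can at least shorten the path: iOS offers a documented URL that deep-links directly to the app's own page in the system settings. A sketch using UIKit (the function name is our own illustrative choice):

```swift
import UIKit

// Apps cannot withdraw location permission themselves, but they can
// send the user straight to their own entry in the system settings,
// where location access can be revoked with a single toggle.
func openLocationSettings() {
    if let url = URL(string: UIApplication.openSettingsURLString) {
        UIApplication.shared.open(url)
    }
}
```

Offering such a shortcut next to the in-app privacy information would at least reduce, if not remove, the grant/revoke asymmetry described above.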
As with most studies, ours is subject to a number of limitations. For one, the results of the analysis are strongly connected to the selection of apps. The analysis was conducted in Germany, and therefore apps popular in Germany were selected. Although this might limit the generalisability of our results, our list contains mainly international apps that are available worldwide, so we consider this limitation to have only a minor impact. The focus was on apps that require access to PLI. Additionally, the categorisation of the 40 analysed apps resulted in eight categories with five apps each. Although this covers a broad range of different apps, a deep analysis between categories was not possible due to the limited sample size in each category. Hence, we divided the apps into only two broader categories (LBS+ and LBS-). As the apps were not analysed in parallel, the examined app versions varied slightly between the analyses. While the majority of Android apps can be retrieved by their specific version number, on iOS we were forced to use the current app versions available in the App Store. This may have caused differences between the analysed iOS apps, but no significant impacts were found when comparing both analyses. Although we can report a sufficient inter-rater reliability, a manual rating process with just two raters is subject to limitations. Personal preferences, opinions and experiences might have influenced the results, even though a previously defined, structured analysis framework was used (see Table 1). In particular, distinguishing between a 'low'/'medium' and a 'medium'/'high' rating is linked to personal preference and therefore remains at least partly subjective. A larger number of researchers conducting such an analysis would help to create more objective and universal results. Another limitation of our work is the scope and depth of the analysis: we looked at how informed consent for accessing PLI is handled.
We did not examine in detail what happens to the PLI after it has been shared with an LBS. This would require an in-depth analysis of the provided privacy policies and connecting it to the content of the apps' consent dialogues. Going one step further, to really find out what happens to PLI shared by users, an analysis of the source code of the app itself and of the server components used would be required. Since the source code of most apps is not available as open source, this is hardly possible. Another factor we did not yet look at explicitly is the set of location privacy control features introduced with the current iOS 14 version. Before iOS 14, it was already possible to limit location usage to one-time sharing (see Figure 9); a similar feature is also available on Android 11 (see Figure 2). With iOS 14, users now have the option of sharing their coarse instead of their precise location (see Figure 9). Although this is definitely part of controlling one's data, this feature is so far only available on iOS and was hence not part of our analysis.
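The iOS 14 controls are also visible on the developer side: an app can check whether it received only approximate location and, where precise data is genuinely needed for a task, ask for temporary full accuracy. A sketch under the assumption that a purpose key named "Navigation" has been declared in the app's Info.plist (the key name here is our own example):

```swift
import CoreLocation

// iOS 14+: the user may grant only approximate (coarse) location.
func handleAccuracy(_ manager: CLLocationManager) {
    if manager.accuracyAuthorization == .reducedAccuracy {
        // Ask for full accuracy for this session only; the reason shown
        // to the user comes from the corresponding entry declared under
        // NSLocationTemporaryUsageDescriptionDictionary in Info.plist.
        manager.requestTemporaryFullAccuracyAuthorization(withPurposeKey: "Navigation")
    }
}
```

Notably, this API inverts the usual default: the user's coarse-only choice persists, and precise access has to be justified per purpose and per session.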
Based on the presented results, we propose a number of improvements to the informed consent procedures in mobile LBS. At the moment, the detachment of giving consent by clicking a button in a dialogue (see Figure 2) from the availability of privacy-related information is problematic. Although available (if sometimes hidden in sub-menus), the option of giving or revoking consent to location sharing is decoupled from information about the usage of PLI. Ideally, information and control mechanisms would be combined in one place or user interface. As pointed out by others, information in the form of privacy policies is not really suitable for giving informed consent to location access (Tsohou and Kosta 2017). Hence, there is a need for explanations regarding what happens to the shared PLI in a way that is not only complete (like privacy policies) but also comprehensible for users. In addition, users can currently only consent to the use of their location data but cannot control what their data is used for (an all-or-nothing approach). Here, proper control mechanisms are needed, similar to cookie banners that let users actually choose for what purposes their data might be used (Utz et al. 2019). Given the complexity of the decisions involved, it may be desirable to specify general privacy preferences at the OS level, which would then preconfigure the informed consent for individual apps. This would save users from having to configure this manually when installing a new app.

Conclusion
Regarding digital sovereignty, our results show that most apps disclose only legally required information and do not provide further location privacy information about why location access is needed and how the data is processed, which makes it more difficult for users to get an idea of what actually happens to their PLI. Our analysis of 40 apps identified several dark patterns; these possibly lead to more users sharing PLI and were mostly found in LBS+ apps, which are more likely to be intrusive than LBS- apps. In addition, users' control options are limited and do not necessarily support them in exerting digital sovereignty over their PLI. In the end, it remains debatable whether users can currently truly give informed consent to location access and whether they are given the autonomy to actually control their PLI. Our findings suggest that there are several issues with informed consent that limit the digital sovereignty of app users regarding location information. Based on our results, it seems highly advisable to provide users with more understandable information and more fine-grained, easy-to-use control options. Regarding the latter, the option to grant one-time location access in the most recent versions of iOS and Android constitutes a first (small) step in the right direction.

Disclosure statement
No potential conflict of interest was reported by the authors.

Funding
This work was supported by the German Federal Ministry of Education and Research (BMBF) under grant number 16SV8478. The content of this publication is the responsibility of the authors.

Data availability statement
The data that support the findings of this study are openly available in OSF at https://doi.org/10.17605/OSF.IO/HC86A.

Table 1. Structured analysis framework used for rating.

How is GDPR handled by apps?

Accessibility of information
• None: this information is not available within the app, but there might be a referral (e.g. to a website or an email address)
• Low: information is hidden in several sub-menus and difficult to find; in some cases, a manual search is necessary to reach the desired information/options (for example within FAQs or some form of help center)
• Medium: information is fairly accessible/reachable within the app; however, it is not an entirely obvious navigation path, e.g. a few steps within the settings of an app
• High: information is highly accessible/reachable within the app; accessible through obvious and transparent navigation patterns, e.g. a top-level entry within the settings section of an app

What PLI-related information is disclosed by apps?

Amount of information
• None: this information is not available within the app, but there might be a referral (e.g. to a website or an email address)
• Low: information is hidden in several sub-menus and difficult to find; in some cases, a manual search is necessary to reach the desired information/options (for example within FAQs or some form of help center)
• Medium: information is fairly accessible/reachable within the app; however, it is not an entirely obvious navigation path, e.g. a few steps within the settings of an app
• High: information is highly accessible/reachable within the app; accessible through obvious and transparent navigation patterns, e.g. a top-level entry within the settings section of an app

Amount of control options
• None: this feature is not available within the app, but there might be a referral (e.g. to a website or an email address)
• Low: options are hidden in several sub-menus and difficult to find
• Medium: more broad and fine-granular options, e.g. the user is able to limit the (location) data to specific usages (e.g. only functional, no analysis or ads); privacy options to limit other users from accessing her information
• High: the user has a broad toolset of options to control her location data usage; the user is in full charge of her location data and what happens to it

Accessibility of control options
• None: this feature is not available within the app, but there might be a referral (e.g. to a website or an email address)
• Low: options are hidden in several sub-menus and difficult to find; in some cases, a manual search is necessary to reach the desired information/options (for example within FAQs or some form of help center)
• Medium: options are fairly accessible/reachable within the app; however, it is not an entirely obvious navigation path, e.g. a few steps within the settings of an app
• High: options are highly accessible/reachable within the app; accessible through obvious and transparent navigation patterns, e.g. a top-level entry within the settings section of an app