RAND Corporation report on America's 5G Era: Balancing Big Data and Privacy. Privacy issues with the voracious data-gathering ability of 5G

June 13, 2022

The RAND Corporation has produced an excellent paper, America's 5G Era: Balancing Big Data and Privacy, which highlights the threat to privacy posed by the introduction of 5G.

The brief summary provides:

Fifth-generation (5G) wireless networking will increase the scale of wireless networks by an order of magnitude or more. Perhaps nothing exemplifies the future of the 5G era more than the ubiquitous surveillance that is gathering more and more-diverse data on people. Even before the 5G era, data were seen as a source of new economic value.

The number of automated sensors and devices connected to wireless networks will grow in the next few years by an order of magnitude or more. Increasingly, these networks will inform artificial-intelligence algorithms, which will then autonomously make decisions and take actions — with humans directly involved only infrequently. In this report, researchers discuss how the United States should seek to balance the potential gains of the 5G era with the potential loss of privacy and of control of personal data.

Key Findings

    • As the volume, variety, and velocity of data gathered increase dramatically, both the value and the risk are likely to increase as well.
    • In the 5G era, a government could expand and automate its surveillance for infectious-disease monitoring and translate that surveillance into controls of day-to-day activity.
    • In the 5G era, law enforcement has more information than ever before, which it can fuse together much more quickly.
    • The 5G era, with increased bandwidth for more-connected devices, will likely continue the trend of the collection and utilization of personal data by firms, both large and small, and could contribute to a ubiquitous mobile surveillance environment.

Recommendation

    • Adopt an explicit principle for widespread data use during the 5G era: that any potential uses of data be identified, well defined, and agreed upon before data are collected and analyzed.

It is a very thoughtful and quite complex report. Some of the more detailed comments include:

  • The 5G era—with much more reliance on machine analyses—will move from inferences to direct measurements, gathered independently of any overt choice by the user.
  • In the 5G era, China could expand and automate its surveillance and translate that surveillance into controls of day-to-day activity. Cameras facing a home’s doorway would communicate, to an automated control center, the face and gait of anyone leaving home (Gan, 2020). AI algorithms could identify the person leaving and check their COVID code with radio-frequency identification (RFID) devices in the doorframe.
  • Advanced facial and gait recognition, cross-checked with fitness-app tracks and maybe RFID hits and accelerated by increased edge computing, has already been used to generate alerts and dispatch police.
  • The 5G era, with increased bandwidth for more-connected devices, will continue the trend of the collection and utilization of personal data by firms, both large and small, and could contribute to a ubiquitous mobile surveillance environment.
  • The changes 5G will usher in are likely to change society drastically, much as previous generations of technological innovation have. Although this change might be gradual, the final, overall technological change is likely to be drastic simply because the amount of data will be significantly larger.
  • Democratic governments have also utilized mass surveillance and data collection to track people, both through systems those governments have developed and by utilizing the data collected by other means. The UK had already utilized automatic license-plate recognition to track and flag vehicle registration, insurance, and crime involvement, and similar technology has been used with drones to track out-of-area vehicles during the COVID-19 lockdown.
  • When personally identifiable information is compromised, companies often simply provide customers with credit-monitoring services for a limited period of time. Credit-monitoring services have also been hacked but have simply put the onus on the individual subject of the data to watch for and identify fraudulent activity on their credit reports for flagging and removal.
  • As more connected devices enter homes and workplaces, cybersecurity and the safeguarding of the data collected will become increasingly urgent.
  • Some effort is made to anonymize data in order to protect privacy and prevent malicious use, but this has proved to be of limited benefit; this information can become deanonymized. Individual people can be identified with only location data over time.
  • Location information collected and made public by the fitness-tracking app Strava might have compromised sensitive U.S. military operations when it published a heat map of users’ activities even without releasing the users’ identities.
  • Use of opaque algorithmic decisions that affect individuals, such as in health care, insurance rates, and hiring and firing decisions, raises questions of equity and democratic control. An algorithm that several states used in sentencing proceedings to calculate recidivism rates was no more accurate than decisions by people with no criminal justice background.
  • In the 5G future, smart devices will automatically connect directly to the 5G network without a guarantee of an off switch. They already listen to conversations—even when users have not asked them to or acted to turn them on—again raising privacy concerns.
  • The changes that the 5G era creates are the volume and nature of the data to be collected and exploited. Each new type of Internet of Things sensor with increased bandwidth and interconnectivity will create new dimensions to the information collected.
  • It is important that communities and governments make decisions now about the trade-offs they are willing to accept—between the potential benefits and the risks associated with widespread mobile surveillance—ideally before the information is collected because, once out there, data cannot be taken back.
  • In light of the forthcoming 5G era and, with it, the increased collection of data covering all aspects of people’s lives, there should be a more structured approach. The basic principle is that the beneficial uses be identified, well defined, and agreed upon before data are collected or new analysis is conducted. This principle also allows—and, in fact, demands—that the related ethical questions be broadly discussed in some structured manner before a system is implemented.
  • Many questions are important to consider, including what privacy users should have and what rights they have over information about them, their movements, and their activities.
  • Involving the relevant stakeholders in any assessment of risks, costs, and benefits is now simply a part of the process, even embedded within the relevant global standard. The problem for the risks and benefits associated with the 5G era is that the stakeholders include the general populace, increasing the challenges involved in having meaningful consultation to define the “right” choices and even communication about those choices. The public needs to view these decisions as having perceptible legitimacy; otherwise, a decision risks being seen as the product of isolated elites—technical or economic—and so risks rejection by large parts of the public.
  • One way of handling this involves an explicit plan for iterating with the stakeholders at each stage in the process. This approach with a diverse public can be possible only if the burden of proof is explicitly—and continually—on those proposing to gather and use data for some purpose that is beneficial or financially advantageous to them. A key characteristic of this type of iterative risk assessment is clear communication of assumptions, limitations, and associated risks. This up-front shifting of the burden of proof requires articulation of and agreement on the ethical questions about what potential risks should be considered.
  • There should be guidelines to consider before building a data product: consent, clarity, consistency, control (and transparency), and consequences (and harm).
    • Consent refers to building trust with the people who are providing the data. The current standard operating procedure is to use acceptance of terms of service (ToS) as evidence of consent. ToS are not the product of a negotiation but rather a one-sided agreement in which each user consents to let the product maker do what it wants with the data it collects. And product makers can and often do unilaterally change these agreements.

    • Control and transparency refers to the amount of control the user has over their data; what data, control, and transparency are provided and collected; and how the data are used.
    • Consequences refers to the harms that can follow from the collection and use of data. There have been innumerable consequences of the collection and use of data, most of which had not been considered by the original technology developer, nor were the associated risks considered in the planning of the technology. Historically, the organizations with the economic incentive to move ahead have not done the best self-policing of their products, as seen with Facebook.

  • In one model of data ownership, people could continuously own their data but be able to grant permission, for only certain uses, to a firm or firms. This could be combined with requirements limiting the further dissemination of the information and with security requirements for holding the information.
  • Another framework for data ownership could be a larger change in the basic legal structure involving personal data. The most sweeping change would be one in which people have fundamental rights to know about the data collected about themselves and have some power over the use of those data.
  • A user cannot legally consent without understanding the terms to which they are consenting. Contact-tracing apps should therefore be explicit with users about the information collected, how it is used, and how their privacy is or is not protected. Clarity of this information for the user will likely affect public trust in adoption of apps and thus the effectiveness of electronic contact tracing.
  • So that users can trust that information is being used in the way in which they have agreed, an app’s purpose and description should not be changed without prior notification to those users.
  • The efficacy of app-based contact tracing is not a given, in that success is dependent on adoption rates and test availability.
  • The 5G era, with its ubiquitous surveillance, brings the promise of real economic gains but also the threat of great losses of privacy, anonymity, safety, and general well-being. Balancing legitimate but often-competing demands is a challenge—but one that appears ripe for meeting. Fundamentally, revenue is important, but it must not be the only metric by which technology is measured.
  • What is needed is a decision to address issues related to data and privacy directly and not to simply wait to see what happens. Adopting an explicit principle for widespread use during this 5G era—that any potential uses of data be identified, well defined, and agreed upon before data are collected and analyzed—can provide the rationale for the process and be a powerful incentive to participate for those wishing to make use of the data.
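The report's point that "anonymized" location data can be deanonymized is easy to make concrete. The following is a minimal sketch with an invented toy dataset and invented names (`traces`, `candidates`); it shows how just a couple of externally observed (place, hour) points can single out one record:

```python
# Toy sketch: a handful of (place, hour) observations is often enough to
# re-identify a record in an "anonymized" location dataset.
# All data and names here are invented for illustration.

traces = {
    "pseudonym_a": {("tower_1", 8), ("tower_4", 12), ("tower_2", 18)},
    "pseudonym_b": {("tower_1", 8), ("tower_3", 12), ("tower_2", 18)},
    "pseudonym_c": {("tower_5", 9), ("tower_4", 12), ("tower_6", 19)},
}

def candidates(observed):
    """Pseudonyms whose trace contains every externally observed point."""
    return [uid for uid, pts in traces.items() if observed <= pts]

# One observed point leaves two candidate records; adding a second point
# leaves exactly one, re-linking the "anonymous" record to a known person.
print(candidates({("tower_1", 8)}))                   # two candidates
print(candidates({("tower_1", 8), ("tower_3", 12)}))  # a unique match
```

The same narrowing effect is what made traces such as the Strava heat map risky even without names attached.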
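The data-ownership model the report describes, in which a person retains ownership and grants a firm permission for only certain uses, can be sketched as a purpose-limited permission record. The class and method names below (`PermissionGrant`, `OwnedRecord`, `is_permitted`) are hypothetical, not from the report:

```python
# Hypothetical sketch of purpose-limited data ownership: the person keeps
# ownership of their data and grants a firm permission for named uses only.
from dataclasses import dataclass, field

@dataclass
class PermissionGrant:
    firm: str
    allowed_uses: set                  # e.g. {"contact_tracing"}
    may_redisseminate: bool = False    # further sharing barred by default

@dataclass
class OwnedRecord:
    owner: str
    grants: list = field(default_factory=list)

    def is_permitted(self, firm: str, use: str) -> bool:
        """A use is allowed only if the owner granted it to that firm."""
        return any(g.firm == firm and use in g.allowed_uses
                   for g in self.grants)

record = OwnedRecord(owner="alice")
record.grants.append(PermissionGrant("health_app", {"contact_tracing"}))

print(record.is_permitted("health_app", "contact_tracing"))   # True
print(record.is_permitted("health_app", "advertising"))       # False
print(record.is_permitted("data_broker", "contact_tracing"))  # False
```

Here every use defaults to forbidden, which mirrors the report's principle that uses be identified and agreed upon before data are collected or analyzed.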
