
Apple’s plan to scan your phone raises the stakes on a key question: Can you trust Big Tech?



(THE CONVERSATION) Apple’s plan to scan customers’ phones and other devices for images depicting child sexual abuse sparked a backlash over privacy concerns, leading the company to announce a delay. Apple, Facebook, Google, and other companies have a long history of scanning customer images that are stored on company servers for this material. Analyzing data on user devices is a big change.

However well-intentioned the plan may be, and whether or not Apple is willing and able to deliver on its promises to protect customer privacy, it highlights the fact that people who buy iPhones are not masters of their own devices. In addition, Apple uses a complicated scanning system that is difficult to audit. Customers therefore face a harsh reality: if you use an iPhone, you have to trust Apple.

Specifically, customers are compelled to trust Apple to use this system only as described, to operate it securely over time, and to put users’ interests ahead of the interests of other parties, including some of the most powerful governments on the planet.

Although Apple’s plan is, so far, unique, the trust issue is not unique to Apple. Other big tech companies also have considerable control over customer devices and insight into their data.

What is trust?

Trust is “the willingness of one party to be vulnerable to the actions of another party,” according to sociologists. People base the decision to trust on experience, signs and signals. But past behavior, promises, the way someone acts, evidence, and even contracts only give you data points. They cannot guarantee future action.

Trust, therefore, is a matter of probabilities. In a sense, you roll the dice whenever you trust a person or an organization.

Trustworthiness is a hidden property. People gather information about a person’s likely future behavior, but they cannot know for certain whether the person has the ability to keep their word, is truly caring, and has the integrity – principles, processes and consistency – to maintain that behavior over time, under pressure or when the unexpected happens.

Trusting Apple and Big Tech

Apple has said that its scanning system will only ever be used to detect child sexual abuse material and that it has multiple strong privacy protections. The technical details of the system indicate that Apple has taken steps to protect user privacy unless the system detects the targeted material. For example, humans will review a person’s suspicious material only when the number of times the system detects the targeted material reaches a certain threshold. However, Apple has offered little evidence about how the system will work in practice.
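To make the threshold mechanism concrete, here is a minimal sketch of how threshold-gated flagging can work in general. It is not Apple’s code: the function names, hash values and the threshold of 30 matches are assumptions for illustration, and it uses an exact cryptographic hash where Apple’s NeuralHash is a perceptual hash of the image content.

    # Minimal illustrative sketch, not Apple's implementation: SHA-256 stands in
    # for the perceptual NeuralHash, and the digests and threshold are made up.
    from hashlib import sha256

    KNOWN_HASHES = {"3a7bd3...", "9f86d0..."}   # hypothetical digests of known, targeted images
    REVIEW_THRESHOLD = 30                        # assumed number of matches before any human review

    def count_matches(images: list[bytes]) -> int:
        """Count how many of a user's images match the known-hash database."""
        return sum(1 for img in images if sha256(img).hexdigest() in KNOWN_HASHES)

    def flag_for_human_review(images: list[bytes]) -> bool:
        """Escalate an account only once the match count reaches the threshold."""
        return count_matches(images) >= REVIEW_THRESHOLD

In Apple’s published design, the matching happens on the device against an encrypted database, and the match count is revealed to Apple only through cryptographic vouchers once the threshold is crossed, which is one reason the system is hard for outsiders to audit.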

Security researchers who analyzed the “NeuralHash” algorithm on which Apple’s scanning system is based, along with civil rights organizations, warn that the system is likely vulnerable to hackers, contrary to Apple’s claims.

Critics also fear that the system could be used to search for other material, such as signs of political dissent. Apple, along with other big tech players, has caved in to demands from authoritarian regimes, including China, to allow government surveillance of technology users. In practice, the Chinese government has access to all user data. What will be different this time around?

It should also be noted that Apple does not operate this system on its own. In the United States, the company plans to use data from, and report suspicious material to, the National Center for Missing and Exploited Children. So trusting Apple is not enough. Users must also trust the company’s partners to act with benevolence and integrity.

Big Tech’s less than encouraging record

This case arises against a backdrop of regular privacy invasions by Big Tech and moves that further restrict consumer freedoms and control over their devices. The companies have positioned themselves as responsible parties, but many privacy experts say there is too little transparency and scant technical or historical evidence for these claims.

Another concern relates to unintended consequences. Apple may genuinely want to protect children and protect user privacy at the same time. Nonetheless, the company has now announced – and staked its reputation on – a technology well suited to spying on large numbers of people. Governments could pass laws requiring the scanning to be extended to other material deemed illegal.

Would Apple, and potentially other tech companies, choose not to follow such laws and pull out of those markets, or would they comply with potentially draconian local rules? Nothing is certain about the future, but Apple and other tech companies have already chosen to accommodate oppressive regimes. Tech companies that choose to operate in China are forced to submit to censorship, for example.

Weighing whether to trust Apple or other tech companies

There is no single answer to the question of whether Apple, Google or their competitors can be trusted. The risks differ depending on who you are and where you are in the world. An activist in India faces different threats and risks than an Italian defense lawyer. Trust is a matter of probability, and risks are not only probabilistic but also situational.

It comes down to how much likelihood of failure or deception you can live with, what the relevant threats and risks are, and what protections or mitigations exist. Your government’s stance, the existence of strong local privacy laws, the strength of the rule of law and your own technical capabilities are all relevant factors. Still, there is one thing you can count on: tech companies typically have extensive control over your devices and data.

Like all large organizations, technology companies are complex: employees and management come and go, and regulations, policies and power dynamics change.

A company that can be trusted today may not be trustworthy tomorrow.

Big Tech companies have behaved in ways that should cause users to question their trustworthiness, particularly when it comes to privacy violations. But they have also defended user privacy in other cases, such as the San Bernardino mass shooting case and the subsequent encryption debates.

Finally, Big Tech does not exist in a vacuum and is not all-powerful. Apple, Google, Microsoft, Amazon, Facebook and others must respond to a variety of external pressures and powers. Perhaps, given these circumstances, more transparency, more independent auditing by journalists and trusted figures in civil society, more user scrutiny, more open-source code and genuine discourse with customers could be a good start toward balancing these different goals.

Although this would be only a first step, consumers would at least be able to make more informed choices about which products to use or buy.
