2022 Resolution on the right to privacy in the digital age

77th UNGA

Analysis of precedential value

This resolution was adopted by the UN General Assembly (UNGA) without a vote in December 2022. The UNGA is the main deliberative and policymaking organ of the UN system and consists of all 193 UN Member States.

Used as precedent

digital health

Noting also that violations and abuses of the right to privacy in the digital age can affect all individuals, with particular effects on women, children, in particular girls, persons with disabilities and older persons, as well as those in vulnerable situations,

Recognizing that new and emerging digital technologies, in particular assistive digital technologies, can especially contribute to the full enjoyment of human rights by persons with disabilities, and that these technologies should be designed in consultation with them and with appropriate safeguards to protect their rights, including their right to privacy,

Recognizing also that the promotion of and respect for the right to privacy are important to the prevention of violence, including gender-based violence, abuse and sexual harassment, in particular against women and children, as well as any form of discrimination, which can occur in digital and online spaces and includes cyberbullying and cyberstalking,

Reaffirming the human right to privacy, according to which no one shall be subjected to arbitrary or unlawful interference with his or her privacy, family, home or correspondence, and the right to the protection of the law against such interference, and recognizing that the exercise of the right to privacy is important for the realization of the right to freedom of expression and to hold opinions without interference and the right to freedom of peaceful assembly and association, and is one of the foundations of a democratic society,

Recognizing the need to further discuss and analyse, based on international human rights law, issues relating to the promotion and protection of the right to privacy in the digital age, procedural safeguards, effective domestic oversight and remedies, the impact of surveillance on the right to privacy and other human rights, as well as the need to examine the principles of non-arbitrariness, lawfulness, legality, necessity and proportionality in relation to surveillance practices,

Recognizing further the need to ensure that international human rights obligations are respected in the conception, design, development, deployment, evaluation and regulation of data-driven technologies and to ensure that they are subject to adequate safeguards and oversight,

Recognizing that the right to privacy is important for the enjoyment of other rights and can contribute to an individual's ability to participate in political, economic, social and cultural life, and noting with concern that violations or abuses of the right to be free from unlawful or arbitrary interference with the right to privacy might affect the enjoyment of other human rights, including the right to freedom of expression and to hold opinions without interference, and the right to peaceful assembly and freedom of association,

Recognizing that, while the use of artificial intelligence can have significant positive economic and social impacts, it requires and allows for the processing of large amounts of data, often relating to personal data, including biometric data and data on an individual's behaviour, social relationships, race or ethnicity, religion or belief, which can pose serious risks to the enjoyment of the right to privacy, especially when done without proper safeguards, in particular when employed for identification, tracking, profiling, facial recognition, classification, behaviour prediction or scoring of individuals,

Noting that the use of artificial intelligence may, without proper technical, regulatory, legal and ethical safeguards, pose the risk of reinforcing discrimination, including structural inequalities, and recognizing that racially and otherwise discriminatory outcomes should be prevented in the design, development, implementation and use of emerging digital technologies,

Noting with concern that certain predictive algorithms are likely to result in discrimination, including when non-representative data are used,

Noting that the use of algorithmic or automated decision-making processes online can affect the enjoyment of individuals' rights offline,

Stressing also the need to ensure that national security and public health measures, including the use of technology to monitor and contain the spread of infectious diseases, are in full compliance with the obligations of States under international human rights law and adhere to the principles of lawfulness, legality, legitimacy with regard to the aim pursued, necessity and proportionality and the need to protect human rights, including the right to privacy, and personal data in the response to health or other emergencies,

Noting the importance of protecting and respecting the right of individuals to privacy when designing, developing or deploying technological means in response to disasters, epidemics and pandemics, especially the coronavirus disease (COVID-19) pandemic, including digital exposure notification and contact tracing,

Noting also that new and emerging digital technologies can contribute to the recovery from global health emergencies, including the COVID-19 pandemic, and recalling in this regard the importance of protecting human health-related and other personal data, while noting with concern that some efforts to combat the COVID-19 pandemic have an adverse impact on the enjoyment of the right to privacy,

Affirms that the same rights that people have offline must also be protected online, including the right to privacy, with special regard given to the protection of children;

Acknowledges that the conception, design, use, deployment and further development of new and emerging technologies, such as those that involve artificial intelligence, may have an impact on the enjoyment of the right to privacy and other human rights, and that the risks to these rights can and should be avoided and minimized by adapting or adopting adequate regulation or other appropriate mechanisms, in accordance with applicable obligations under international human rights law, for the conception, design, development and deployment of new and emerging technologies, including artificial intelligence, by taking measures to ensure a safe, transparent, accountable, secure and high quality data infrastructure and by developing human rights-based auditing mechanisms and redress mechanisms and establishing human oversight