
Biased Algorithms and the Discrimination upon Immigration Policy

Chapter in: Law and Artificial Intelligence

Abstract

Artificial intelligence has been used in decisions concerning the admission and reception of migrants and refugees into a territory, and even their deportation from it. Since decisions involving migration can change the course of people's lives, it is imperative to verify the neutrality of the algorithms used. This chapter analyses how AI has been applied to decision-making on migration, evaluating in particular whether AI violates international pacts on the protection of human rights. The chapter considers case studies from Canada, New Zealand, and the United Kingdom, as well as a pilot project that might be implemented in Europe. It concludes that automated decisions regarding immigration have the potential to discriminate against migrants, and likely have been doing so since their creation, owing to intrinsic biases in both the current application methods and the systems themselves. Possible solutions that might help these systems provide equal treatment to migrants include greater transparency about the variables used in training, as well as consistent evaluation of methods and performance to detect and remove biases emerging from historical data or structural inequity.


Notes

  1. International Organization for Migration 2019.
  2. See UNHCR 2020a.
  3. This chapter studies the examples of Hungary, Latvia and Greece.
  4. See United Nations 1948.
  5. See UNHCR 1951. According to Article 3 of the Convention, “The Contracting States shall apply the provisions of this Convention to refugees without discrimination as to race, religion or country of origin”.
  6. See UNHCR 2020b.
  7. See Article 2.2 ICESCR. United Nations 1966.
  8. See Article 4.1 ICCPR. United Nations 1966.
  9. See Article 1.1 ICERD. United Nations 1965.
  10. See Article 1 CEDAW. United Nations 1979.
  11. See Article 5 CRPD. United Nations 2006.
  12. See Article 2.1 CRC. United Nations 1989.
  13. See Article 7 Universal Declaration of Human Rights. United Nations 1948.
  14. See objective n. 17 of the Intergovernmentally Negotiated and Agreed Outcome 2018.
  15. See topic III, “Program of Action”, Section B “Areas in need of support”, 2.10 of The Global Compact on Refugees. United Nations 2018.
  16. The concepts of voluntary and forced migration will be further explored in Sect. 10.6.
  17. Home Office (undated) About us. https://www.gov.uk/government/organisations/home-office/about Accessed 23 January 2021.
  18. The Joint Council for the Welfare of Immigrants 2020.
  19. See n. 18.
  20. See n. 18.
  21. The Equality Act 2010 is internal UK legislation, see https://www.legislation.gov.uk/ukpga/2010/15/contents Accessed 23 January 2021.
  22. See iBorderCtrl undated; Molnar 2019a, p. 2.
  23. See n. 22.
  24. Bonnett 2018; Tan 2018.
  25. Bashir 2018; Robson 2018.
  26. Molnar and Gill 2018.
  27. See n. 26.
  28. See n. 26.
  29. See n. 26.
  30. See n. 26.
  31. See n. 26.
  32. See n. 26.
  33. Beduschi 2020, p. 10; Molnar 2019a, p. 2.
  34. It is argued that “As such, bias can be introduced into every stage of the development and deployment of systems: as from the intention that initially governs the algorithm’s development, during the creation of the computer code, the executable code, during execution, in the context of execution and maintenance”. Défenseur des droits and Commission Nationale Informatique & Libertés 2020.
  35. See Kearns and Roth 2020, pp. 61 and 87.
  36. Kleinberg et al. 2019, p. 4.
  37. See Borgesius 2018.
  38. See n. 37.
  39. Barocas and Selbst 2016, pp. 677–693; Kleinberg et al. 2019, pp. 17 and 18.
  40. Barocas and Selbst 2016, p. 678.
  41. Kleinberg et al. 2019, p. 18.
  42. Barocas and Selbst 2016, pp. 678 and 680; Kleinberg et al. 2019, pp. 21 and 22.
  43. Barocas and Selbst 2016, pp. 681, 683 and 684.
  44. Barocas and Selbst 2016, p. 681; Kleinberg et al. 2019, p. 22.
  45. Barocas and Selbst 2016, p. 688; Kleinberg et al. 2019, p. 18.
  46. Barocas and Selbst 2016, p. 688.
  47. Kleinberg et al. 2019, p. 22.
  48. Barocas and Selbst 2016, pp. 691 and 692.
  49. Barocas and Selbst 2016, p. 691.
  50. Barocas and Selbst 2016, p. 692.
  51. See n. 50.
  52. Kleinberg et al. 2019, p. 23.
  53. Kearns and Roth 2020, p. 68.
  54. Kearns and Roth 2020, p. 73.
  55. Kearns and Roth 2020, pp. 68 and 69.
  56. Kearns and Roth 2020, p. 63.
  57. Kearns and Roth 2020, pp. 67 and 77.
  58. Kearns and Roth 2020, p. 67.
  59. Smith and Rustagi 2020.
  60. Molnar 2019b, p. 321.
  61. See n. 2.
  62. Baynes 2019; Molnar 2019a, p. 2.
  63. See n. 37.
  64. Défenseur des droits and Commission Nationale Informatique & Libertés 2020.
  65. See n. 64.
  66. See n. 64.
  67. See n. 37.


Copyright information

© 2022 T.M.C. Asser Press and the authors

About this chapter


Cite this chapter

Laupman, C., Schippers, LM., Papaléo Gagliardi, M. (2022). Biased Algorithms and the Discrimination upon Immigration Policy. In: Custers, B., Fosch-Villaronga, E. (eds) Law and Artificial Intelligence. Information Technology and Law Series, vol 35. T.M.C. Asser Press, The Hague. https://doi.org/10.1007/978-94-6265-523-2_10

Download citation

  • DOI: https://doi.org/10.1007/978-94-6265-523-2_10

  • Publisher Name: T.M.C. Asser Press, The Hague

  • Print ISBN: 978-94-6265-522-5

  • Online ISBN: 978-94-6265-523-2

  • eBook Packages: Law and Criminology (R0)
