After the COVID-19 pandemic halted many asylum procedures across Europe, new technologies are now reviving these systems. From lie-detection tools tested at the border to programs that validate documents and transcribe interviews, a wide range of solutions is being deployed in asylum applications. This article explores how these technologies are reshaping the way asylum procedures are conducted. It shows how asylum seekers are turned into pressured, hindered techno-users: they are asked to comply with a series of techno-bureaucratic steps and to keep up with capricious, minor changes in criteria and deadlines. This obstructs their capacity to navigate these systems and to pursue their legal right to protection.
It also demonstrates how these technologies are embedded in refugee governance: they facilitate the ‘circuits of financial-humanitarianism’ that operate through a whirlwind of dispersed technological requirements. These requirements increase asylum seekers’ socio-legal precarity by hindering them from accessing the channels of protection. It further argues that studies of securitization and victimization should be combined with insight into the disciplinary mechanisms of technologies, whereby migrants are turned into data-generating subjects who are disciplined by their reliance on technology.
Drawing on Foucault’s notion of power/knowledge and on territorial expertise, the article argues that these technologies have an inherent obstructiveness. They have a double impact: although they help to expedite the asylum procedure, they also make it difficult for refugees to navigate these systems. Refugees are positioned in a ‘knowledge deficit’ that leaves them vulnerable to flawed decisions made by non-governmental actors and to ill-informed and unreliable narratives about their situations. Moreover, these technologies pose new risks of ‘machine mistakes’ that may lead to inaccurate or discriminatory outcomes.