After the COVID-19 pandemic halted many asylum procedures across Europe, new technologies are now reviving these systems. From lie-detection tools tested at the border to programs that validate documents and transcribe selection interviews, a wide range of technologies is being deployed in asylum procedures. This article explores how these technologies have reshaped the way asylum procedures are conducted. It shows how asylum seekers are turned into hindered techno-users: they are required to comply with a series of techno-bureaucratic steps and to keep up with capricious, minor changes in criteria and deadlines. This obstructs their capacity to navigate these systems and to pursue their right to protection.
It also illustrates how these technologies are embedded in refugee governance: they facilitate the ‘circuits of financial-humanitarianism’ that operate through a whirlwind of dispersed technological requirements. These requirements increase asylum seekers’ socio-legal precarity by hindering their access to the channels of protection. The article further argues that analyses of securitization and victimization should be combined with insight into the disciplinary mechanisms of these technologies, whereby migrants are turned into data-generating subjects who are disciplined by their dependence on technology.
Drawing on Foucault’s notion of power/knowledge and expertise, the article argues that these systems have an inherent obstructiveness. They have a double effect: while they help expedite the asylum process, they also make it difficult for refugees to navigate these systems. Refugees are placed in a ‘knowledge deficit’ that leaves them vulnerable to illegitimate decisions made by non-governmental actors and to ill-informed, unreliable narratives about their situations. Moreover, the technologies pose new risks of ‘machine mistakes’ that may lead to inaccurate or discriminatory outcomes.