Post-Doc in CS/HCI: Smart and semi-automatic interfaces
We want to explore an approach that combines AI algorithms with user experience design methods, known as Human-Centered AI (HCAI) [1]. HCAI combines research on AI algorithms with human-centered thinking to shape technologies that amplify, augment, empower, and enhance human performance. Researchers and developers of HCAI systems value meaningful human control, putting people first by serving human needs, values, and goals. More specifically, we will explore a sub-domain of HCAI called Interactive Machine Learning (IML): the design and implementation of algorithms and intelligent user interface (IUI) frameworks that facilitate machine learning (ML) through human interaction, i.e. an interaction paradigm in which a user or a group of users refines a mathematical model describing a concept through iterative cycles of input and review [2,8]. The idea is to go beyond treating AI as a black box whose output is simply accepted: the output is refined and adjusted together with the user.

Based on experiments on real use cases drawn from the RUD project, our goal is to identify a general process for providing semi-automatic indexing and clustering. Taking into account the different interactive machine learning strategies, recent work [3] shows that giving the user (a novice in AI but a domain expert) a more proactive role can increase performance. But what about the role of the interaction techniques? How do interaction metaphors influence the process? How does the user progress towards the task objectives? Previous work among the RUD partners shows that it is much less expensive and more efficient to involve the user throughout the process, and especially at the beginning, rather than only at the end to validate or correct errors. Involving users in semi-automatic interfaces [4] can reduce, and even eliminate, the error rate for all tasks [5].
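To make the iterative cycle of input and review concrete, the following is a minimal, hypothetical Python sketch of such a loop, using uncertainty sampling with scikit-learn; the function names (interactive_labeling, ask_user), the choice of classifier, and the sampling strategy are assumptions for illustration only, not the RUD project's actual pipeline.

# Hypothetical sketch of an interactive machine learning loop: the model is
# trained on the labels gathered so far, the domain expert reviews only the
# items the model is least certain about, and the model is retrained.
import numpy as np
from sklearn.linear_model import LogisticRegression

def interactive_labeling(X, seed_idx, seed_labels, ask_user, n_rounds=10, batch=5):
    """X: NumPy feature matrix; seed_idx/seed_labels: a few initial labels;
    ask_user(indices) -> labels supplied by the domain expert."""
    labels = dict(zip(seed_idx, seed_labels))
    model = LogisticRegression(max_iter=1000)
    for _ in range(n_rounds):
        idx = list(labels)
        model.fit(X[idx], [labels[i] for i in idx])
        # Uncertainty sampling: query the unlabeled items the model is least sure about.
        unlabeled = [i for i in range(len(X)) if i not in labels]
        if not unlabeled:
            break
        proba = model.predict_proba(X[unlabeled])
        uncertainty = 1.0 - proba.max(axis=1)
        query = [unlabeled[j] for j in np.argsort(-uncertainty)[:batch]]
        # The user reviews and corrects only these items, not the whole dataset.
        for i, label in zip(query, ask_user(query)):
            labels[i] = label
    return model, labels

In such a loop the user intervenes from the very first rounds, which is consistent with the observation above that early involvement is cheaper and more effective than late error correction.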
In terms of methodology, WIMP (Windows, Icons, Menus, Pointer) interfaces and post-WIMP interfaces [6] will be explored and compared through user experiments. Post-WIMP interfaces comprise work on user interfaces that go beyond the paradigm of windows, icons, menus and a pointing device. We will follow existing guidelines for Human-AI Interaction [7] and adopt the common language and concepts of [8], as well as the principles based on a common workflow for building effective interfaces for IML. It will be necessary to carefully establish a productive dialogue between user and software (framing feedback, capturing input, explaining decisions) and to produce rich interaction in order to enhance interactivity. The challenge is to deal with users who are knowledgeable but imprecise and inconsistent, and hence to infer user intent from user input. Once the WIMP and post-WIMP interfaces are prototyped, an inspection and a user experience evaluation will be conducted to verify that the newly produced versions are better than the existing ones, and to compare the pros and cons of the two paradigms, WIMP and post-WIMP, with respect to the tasks.
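As an illustration of the last point, the hypothetical sketch below aggregates repeated, possibly conflicting judgments from a user on the same item into a single inferred label plus an agreement score, so the interface can ask for explicit confirmation when agreement is low; the function name, weighting scheme, and threshold are assumptions for the example only.

# Hypothetical sketch of inferring user intent from imprecise, inconsistent input.
from collections import Counter

def infer_intent(judgments, weights=None, agreement_threshold=0.7):
    """judgments: labels the user gave for one item (possibly inconsistent);
    weights: optional per-judgment confidence (e.g. recency or dwell time)."""
    weights = weights or [1.0] * len(judgments)
    votes = Counter()
    for label, w in zip(judgments, weights):
        votes[label] += w
    label, score = votes.most_common(1)[0]
    agreement = score / sum(weights)
    # Below the threshold, the interface should ask the user to confirm
    # rather than silently accept the inferred label.
    return label, agreement, agreement >= agreement_threshold

# Example: three judgments on one item, the most recent weighted more heavily.
print(infer_intent(["cat", "dog", "cat"], weights=[1.0, 1.0, 1.5]))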
Required profile
Education – The candidate must have a PhD in computer science with HCI and/or AI skills.
Other abilities – The candidate should be highly motivated for this project, have strong analytical skills, and be able to code in Python. The candidate is expected to conduct user studies. The candidate should be sociable, curious, autonomous and rigorous, and able to discuss his/her project with industrial and academic partners. The candidate should also be interested in teaching in English and/or in French.
Application
The application must include a CV and a cover letter. If possible, a letter of recommendation or the name of a reference person may be added to the application. Applications must be sent by email to Pr Nadine Couture (n.couture@estia.fr) as soon as possible, and no later than 10 September 2021. Selected candidates will be invited to meet the scientific supervisors in an interview around mid-September 2021.
Dates Reminder
* Applications: as soon as possible (at the latest 10 September 2021)
* Interviews: September 2021
* Post-doc start: October 2021 (duration: 18 months)
Location
* ESTIA-Recherche, Bidart, France
* Possible short stays, with specific outputs and programme, in the partner SMEs of the RUD project