
Enrich data #1

Open
2 tasks
pvhee opened this issue Jan 23, 2017 · 0 comments
pvhee commented Jan 23, 2017

  • Find out why the subjects field is empty
  • Find out whether more structured info is available for programme, call, and topics

Example project output

  rcn:                  194549
  id:                   648785
  acronym:              BODY-UI
  status:               SIGNED
  programme:            H2020-EU.1.1.
  topics:               ERC-CoG-2014
  frameworkProgramme:   H2020
  title:                Using Embodied Cognition to Create the Next Generations of Body-based User Interfaces
  startDate:            2015-05-01
  endDate:              2020-04-30
  projectUrl:           
  objective: 
    """
      Recent advances in user interfaces (UIs) allow users to interact with computers using only their body, so-called body-based UIs. Instead of moving a mouse or tapping a touch surface, people can use whole-body movements to navigate in games, gesture in mid-air to interact with large displays, or scratch their forearm to control a mobile phone. Body-based UIs are attractive because they free users from having to hold or touch a device and because they allow always-on, eyes-free interaction. Currently, however, research on body-based UIs proceeds in an ad hoc fashion and when body-based UIs are compared to device-based alternatives, they perform poorly. This is likely because little is known about the body as a user interface and because it is unclear whether theory and design principles from human-computer interaction (HCI) can be applied to body-based UIs. While body-based UIs may well be the next interaction paradigm for HCI, results so far are mixed.
      
      This project aims at establishing the scientific foundation for the next generations of body-based UIs. The main novelty in my approach is to use results and methods from research on embodied cognition. Embodied cognition suggest that thinking (including reasoning, memory, and emotion) is shaped by our bodies, and conversely, that our bodies reflect thinking. We use embodied cognition to study how body-based UIs affect users, and to increase our understanding of similarities and differences to device-based input. From those studies we develop new body-based UIs, both for input (e.g., gestures in mid-air) and output (e.g., stimulating users’ muscles to move their fingers), and evaluate users’ experience of interacting through their bodies. We also show how models, evaluation criteria, and design principles in HCI need to be adapted for embodied cognition and body-based UIs. If successful, the project will show how to create body-based UIs that are usable and orders of magnitude better than current UIs.
    """
  totalCost:            1853158
  ecMaxContribution:    1853158
  call:                 ERC-2014-CoG
  fundingScheme:        ERC-COG
  coordinator:          KOBENHAVNS UNIVERSITET
  coordinatorCountry:   DK
  participants:         
  participantCountries: 
  subjects:             
  organizations:  ...
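As a first step toward the first task, one could scan the full project dump and count, per field, how many records are empty; that shows whether subjects is empty only here or across the whole dataset. A minimal Python sketch, assuming the records are available as dicts shaped like the example above (the toy data below is hypothetical):

```python
from collections import Counter

def count_empty_fields(projects):
    """Count, per field, how many project records have an empty value."""
    empties = Counter()
    for project in projects:
        for field, value in project.items():
            if value in ("", None, []):
                empties[field] += 1
    return empties

# Two toy records mirroring a few fields of the example output (hypothetical data).
projects = [
    {"rcn": "194549", "acronym": "BODY-UI", "subjects": "", "participants": ""},
    {"rcn": "194550", "acronym": "OTHER", "subjects": "", "participants": "X"},
]

counts = count_empty_fields(projects)
print(counts["subjects"], counts["participants"])  # 2 1
```

If subjects turns out to be empty in every record, the problem is likely upstream in the source export rather than in our parsing.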