Embodied Interaction

New to 2021

Mocap Toolbox

CLOCK: [2021-02-09 Tue 01:12]–[2021-02-09 Tue 14:16] => 13:04

Mocap Resources   PUBLISHED MOODLE

https://www.moodle.aau.dk/mod/folder/view.php?id=1207074 /GP43NC/smc8-courses-embodied-interaction/src/branch/master/Mocap_Resources

DONE 1- INTRO [5/7]

CLOSED: [2021-02-08 Mon 23:24] SCHEDULED: <2020-02-04 Tue 09:00-16:00>

CLOCK: [2020-02-04 Tue 09:00]–[2020-02-04 Tue 15:20] => 6:20
CLOCK: [2019-02-05 Tue 05:22]–[2019-02-05 Tue 05:53] => 0:31
CLOCK: [2018-02-08 Thu 16:15]–[2018-02-08 Thu 16:40] => 0:25
CLOCK: [2018-02-08 Thu 14:34]–[2018-02-08 Thu 16:15] => 1:41
CLOCK: [2018-02-07 Wed 14:23]–[2018-02-07 Wed 14:23] => 0:00
CLOCK: [2018-02-06 Tue 15:42]–[2018-02-06 Tue 16:07] => 0:25

After completing EI with your mini-project, you'll be able to

  • distinguish the three key directions in theories of embodiment, after cite:Svanaes-2020-CHI (in 2020 this was cite:Hornecker:2017we; in 2021 it is cite:Svanaes-2020-CHI)

| Point-of-view / Tense | Past | Present | Future |
|-----------------------+------+---------+--------|
| 1st – Me | Accessing memories of how it felt for me in the past. | Awareness of how it feels for me here and now. | Awareness of how it feels for me when I am enacting a possible future. |
| 2nd – You | Empathically observing recordings of someone else in the past. | Empathically observing someone else here and now. | Empathically observing someone else enacting a possible future. |
| 3rd – He/She | Analytically observing recordings of oneself or someone else in the past. | Analytically observing oneself or someone else here and now. | Analytically observing oneself or someone else enacting a possible future. |

Table 1. A 3×3 matrix of Point-of-View and Tense.
  • identify mover-observer-machine perspectives in EI cite:Loke:2013ic

  • understand the needs of

    • movement as a design material and

    • developing bodily skills cite:Loke:2013ic

  • stream data from SmartSuit Pro to Unity/Unreal

DONE [B] EI01-CODE [3/4]

CLOSED: [2018-02-08 Thu 18:02]

DONE Processing

CLOSED: [2018-02-08 Thu 16:34]

See https://github.com/cerkut/EI18-Vitruvian-Processing

DONE RAM DANCE TOOLKIT

CLOSED: [2018-02-08 Thu 16:34]

  • download RAM Dance Toolkit v1.3.0 for oF v0.9.8 (released on 23 Oct 2017) https://github.com/YCAMInterlab/RAMDanceToolkit/releases

    Below we look into the OSX version; Windows users should download the Windows release instead, and optionally the Motion Data OSC Server.

    wget https://github.com/YCAMInterlab/RAMDanceToolkit/releases/download/v1.3.0/RAM-app_osx_v1_3_0.zip
    wget https://github.com/YCAMInterlab/RAMDanceToolkit/releases/download/v1.0.0/RAM-OSCServer_mac-v1_0_0.zip
    unzip RAM-app_osx_v1_3_0.zip
    unzip RAM-OSCServer_mac-v1_0_0.zip

  • After unzipping RAM-app_osx_v1_3_0, launch RAM Dance Toolkit.app within the folder

    If the app opens and you see the debug menu above the checkerboard floor, all is fine; proceed. If not, see the note at https://github.com/YCAMInterlab/RAMDanceToolkit/releases

  • Load recorded movement data from data/Resources/MotionData by dragging & dropping a file onto the app. You can load up to five movement data files. Press TAB to switch between different UIs.

  • Try some of the Effects on Actors (e.g., Kepler)

  • Learn more at https://github.com/YCAMInterlab/RAMDanceToolkit/wiki/Overview

DONE Optional: Motion Data OSC Server

CLOSED: [2018-02-08 Thu 16:34]

The Motion Data OSC Server sends OSC messages to the RAMDanceToolkit, which is useful for testing: drag and drop XML files onto the server app screen, instead of the client window.

TODO OSC Namespace

https://github.com/YCAMInterlab/RAMDanceToolkit/wiki/Structure-of-RAMDanceToolkit

Each skeleton message carries:

  • <String> ActorName
  • <Int> #Nodes
  • Array of Nodes
  • <f> Message Timestamp
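As a sketch of how a client could read this namespace, here is a minimal receiver using the python-osc package. The /ram/skeleton address, the listening port (10000), and the argument layout are assumptions based on the wiki page above; verify them against your toolkit version.

  # Minimal OSC receiver sketch (python-osc). The address "/ram/skeleton",
  # port 10000, and the argument layout are assumptions to verify against
  # the RAMDanceToolkit version you are running.
  from pythonosc.dispatcher import Dispatcher
  from pythonosc.osc_server import BlockingOSCUDPServer

  def on_skeleton(address, *args):
      actor_name = args[0]    # <String> ActorName
      num_nodes = args[1]     # <Int> #Nodes, followed by the array of nodes
      timestamp = args[-1]    # <f> message timestamp
      print(f"{actor_name}: {num_nodes} nodes at t={timestamp}")

  dispatcher = Dispatcher()
  dispatcher.map("/ram/skeleton", on_skeleton)
  BlockingOSCUDPServer(("0.0.0.0", 10000), dispatcher).serve_forever()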

DONE [B] ROKOKO SmartSuit Pro + Unity   MAC

CLOSED: [2018-02-08 Thu 17:

# wget  https://api.rokoko.com/v1/download/software/smartsuit-studio/SmartsuitStudioInstaller_1_3_1b.exe
  wget https://api.rokoko.com/v1/download/software/smartsuit-studio/SmartsuitStudio_OSX_1_3_1b_installer.dmg

https://cdn.rokoko.com/software/smartsuit-studio/RokokoStudio_osx_1_7_0b_installer.dmg http://help.rokoko.com

  • Launch the app, open SmartsuitDemo Project by pressing the arrow

  • Select one of the recordings in scene-1

  • Go to the leftmost icon / Advanced settings / Network settings. Enable Forward data. Set the Forward port to anything other than 14041 (which is used for communication with the actual suit); 14042 is good. (A quick way to verify forwarding is sketched after this list.)

  • Create a new project; when launched, go to Window / Asset Store, search for Rokoko, and get the Smartsuit Plugin.

  • Import the Smartsuit Plugin (ALL)

  • Open the scene under Rokoko/Smartsuit/Examples/SmartsuitExampleActor

  • Go in Hierarchy to SmartsuitReceiver and change the bold-face PortRangeStart and PortRangeEnd to the value you used in Smartsuit Studio (14043).

  • Press Play; your humanoid avatar should sync to the one in Smartsuit Studio.

  • You can now add components to the SmartsuitActor, e.g. Particles, Lines, or other Visual Effects https://docs.unity3d.com/Manual/comp-Effects.html

  • Unicast/Broadcast is governed by an icon next to the suit on the right.
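Before wiring up Unity, you can sanity-check that Studio is actually forwarding data. A minimal Python sketch, assuming the forward port chosen above (14042) and plain UDP; the payload itself is left unparsed:

  # Listen on the Rokoko Studio forward port and report incoming UDP
  # traffic. Assumes port 14042 as configured above; payload not parsed.
  import socket

  PORT = 14042
  sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
  sock.bind(("0.0.0.0", PORT))
  print(f"listening on UDP port {PORT} ...")
  while True:
      data, addr = sock.recvfrom(65535)
      print(f"{len(data)} bytes from {addr[0]}:{addr[1]}")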

EI-SmartSuit101 @ Collab 2018.3.0f2

To demonstrate real-time (RT) usage of the SmartSuit:

  • 00 - Primitives
  • 01 - Particles
  • 02 - FBX

More to learn at https://www.rokoko.com/en/learn/

TODO SmartSuit Pro + Unreal

TODO Table to Moodle. Microproject implementation / report

CLOCK: [2018-02-07 Wed 15:36]–[2018-02-08 Thu 12:57] => 21:21

To represent the observer perspective, on Unity/Unreal/Processing etc., put at least

Challenge

Make it dynamic (with Rokoko Studio). For the curious, here is a dynamic, first-person representation of the felt movement: https://goo.gl/images/9oE1ra (Oskar Schlemmer, Egocentric Space Delineation)

2-The Embodied Alternative 9.2

| Asahi | I | Body Map |
| Jelle | 1 | P1 EI Tech Definition 8 parts |
|       |   | P2 |
|       |   | P3 |

MATLAB

Learning outcomes: after this session, you will be able to

  • understand the bodily skills needed for technological development, decision making, steering, and path finding in games via AI

  • apply methods and techniques to real-world scenarios (games) and project concepts

  • analyze, compare, and assess the potential of different methods and techniques in order to make the proper design choices in games

Please check out the links in the MATERIAL before the lecture, and start thinking how to integrate these elements in your mini-project design.
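As a toy illustration of the steering and path-finding building blocks named above, here is a minimal Reynolds-style seek behaviour in Python; the 2D setup and constants are illustrative, not course code.

  # Reynolds-style "seek": steer toward a target at max speed, with the
  # steering force clamped so turns stay smooth. Illustrative 2D sketch.
  import numpy as np

  def seek(position, velocity, target, max_speed=2.0, max_force=0.1):
      desired = target - position
      dist = np.linalg.norm(desired)
      if dist > 0:
          desired = desired / dist * max_speed    # desired velocity
      steering = desired - velocity
      norm = np.linalg.norm(steering)
      if norm > max_force:
          steering = steering / norm * max_force  # clamp the force
      return steering

  pos, vel = np.zeros(2), np.zeros(2)
  target = np.array([10.0, 5.0])
  for step in range(200):
      vel += seek(pos, vel, target)
      pos += vel
  print(pos)  # position after 200 steps of seeking the target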

3-Technologies

https://sway.office.com/DwYt3g5WrXUL8Z91?ref=Link

4-External representation

CLOCK: [2021-02-27 Sat 06:42]–[2021-02-27 Sat 07:59] => 1:17

EI-4 External representations (23.2) on [[https://teams.microsoft.com/l/message/19:5c75652b76c34e63a5d3159a963fa95f@thread.tacv2/1614001568573?tenantId=f5dbba49-ce06-496f-ac3e-0cf14361d934&groupId=18e0bcb5-1194-4ba4-9035-19d80ae4e424&parentMessageId=1614001568573&teamName=2021-Embodied Interaction&channelName=General&createdTime=1614001568573][TEAMS]]

The breakdown of EI-4: we will meet on Teams, using this channel

5-Socially Situated Practices

CLOCK: [2021-02-26 Fri 14:37]–[2021-02-26 Fri 14:51] => 0:14

  • 08:45 - 09:00 The Movement Stream (not mandatory)

  • [ ] 09:00 - 09:20 Brief feedback on descriptions of mini project ideas

  • 09:20 - 09:50 Recap and discussion of Socially Situated Practices, with examples from other activities. Please prepare by watching the Socially Situated Practices videos (80 minutes): https://www.moodle.aau.dk/mod/page/view.php?id=1175703

  • 10:00 - 10:50 Different perspectives in interaction design and embodied music cognition. Activities based on Hornecker, Marshall & Hurtienne 2017 (in Files); also Svanæs?

  • 11:00 - 11:50 Extracting features from Mocap Data; Mocap Toolbox work. Download the script (danceDataFeatures.m) and data in a zipped file in Files and on Moodle. There will be an assignment with simple data description and analysis.

6-Action / Perception Coupling 9.3 (Sofia)

7-Participatory Sensemaking 16.3

8-Phenomenology and Somaesthetics

9-Checkpoint: Towards mini-projects 6.4

Mini Seminar & Workshop 20.4

COMMENT 2020 Organization

Intro

https://www.moodle.aau.dk/course/view.php?id=25152#section-1

Intro/Wrap-up slides: https://drive.google.com/open?id=1SgEQYxzRvBRdpUyKLktIf27NZLwL2-43

A miniature version of the course, with both theoretical and practical elements.

Theoretical and practical approaches: Soma Design Theory 1

CLOCK: [2018-02-22 Thu 09:07]–[2018-02-22 Thu 09:44] => 0:37
CLOCK: [2017-03-02 Thu 09:09]–[2017-03-02 Thu 15:32] => 6:23
CLOCK: [2017-03-01 Wed 21:06]–[2017-03-01 Wed 21:12] => 0:06

  • How can the type of action we do affect our perception?

  • How can changing the perspective aid in the design process?

  • What type of tools can we use to describe and characterize the movements and interactions we design for?

After this session, you'll be able to

  • understand how our bodies affect perception and action

  • identify different perspectives used in design and describe how these affect the design process and outcomes.

  • compare and apply different approaches to describe and characterize movements:

    • physical-based descriptors such as

      • position

      • velocity

      • acceleration

      • jerk

    • high-level descriptors such as Laban Movement Analysis (LMA) Effort qualities (a numeric sketch of the physical descriptors appears after this section)

    Preparation

    DONE Read: Perception Viewed as a Phenotypic Expression

    CLOSED: [2017-03-02 Thu 09:39]

    CLOCK: [2017-03-01 Wed 21:12]–[2017-03-02 Thu 09:09] => 11:57

    cite:Proffitt:2013tv

    Skim: A brief overview of Laban Movement Analysis
  • Laban Movement Analysis :: <<<LMA>>> is a theoretical and experiential system for the observation, description, prescription, performance, and interpretation of human movement.
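To make the physical-based descriptors listed above concrete, here is a minimal numeric sketch in Python, assuming a (frames, 3) array of marker positions sampled at a known frame rate:

  # Position -> velocity -> acceleration -> jerk by finite differences.
  # Assumes pos is a (frames, 3) array and fps is the capture frame rate.
  import numpy as np

  def physical_descriptors(pos, fps=100.0):
      dt = 1.0 / fps
      vel = np.gradient(pos, dt, axis=0)    # first time derivative
      acc = np.gradient(vel, dt, axis=0)    # second
      jerk = np.gradient(acc, dt, axis=0)   # third
      return vel, acc, jerk

  pos = np.cumsum(np.random.randn(500, 3), axis=0) * 0.01  # toy trajectory
  vel, acc, jerk = physical_descriptors(pos)
  print(np.linalg.norm(vel, axis=1).mean())  # mean speed over the take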

TASKS/ EXERCISES

Describe it using Laban Movement Analysis (Effort). A table of EFFORT briefly describing the different characteristics can be found in this document.

Assignment

CLOCK: [2020-02-18 Tue 01:01]–[2020-02-18 Tue 01:46] => 0:45

Submit a pdf with your (brief) answers to the following tasks:

  1. Select two movement videos from the MOCAP FILES folder below to study in more detail.

  2. Try to describe the selected movements in terms of Laban Effort (Space, Time, Weight, and Flow). Which computable movement descriptors (link to the paper by Larboulette & Gibet) would seem good for describing and separating the movement characteristics of the two videos?

  3. In the same folder you will find ASCII files with simple movement descriptors (position, derivatives, hand distance). Use a program of your choice to load and plot the files over time. Compare, and try to match them with the movements shown in the available AVI files. Which of the movement videos are you looking at? What other movement features might be useful to compute and compare?

Details on data files:

  • SmoothedPos.tsv – smoothed position data of three markers 1, 2, 3 (x, y, z)

  • Velocity.tsv – velocity data, three markers (x, y, z)

  • Acceleration.tsv – acceleration data, three markers (x, y, z)

  • Handdist.dat – distance between hands (markers 2, 3), vector
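For task 3, any plotting environment works; as one option, here is a minimal Python sketch, assuming the files are whitespace-separated numeric columns without header rows (adjust the delimiter or skiprows if the exports differ):

  # Load the descriptor files and plot them over time (frame index).
  # Assumes whitespace-separated numeric columns and no header rows.
  import numpy as np
  import matplotlib.pyplot as plt

  pos = np.loadtxt("SmoothedPos.tsv")   # (frames, 9): markers 1-3 x (x,y,z)
  vel = np.loadtxt("Velocity.tsv")      # (frames, 9)
  hand = np.loadtxt("Handdist.dat")     # (frames,): distance between hands

  fig, axes = plt.subplots(3, 1, sharex=True)
  axes[0].plot(pos[:, :3]); axes[0].set_ylabel("marker 1 pos")
  axes[1].plot(vel[:, :3]); axes[1].set_ylabel("marker 1 vel")
  axes[2].plot(hand);       axes[2].set_ylabel("hand distance")
  axes[2].set_xlabel("frame")
  plt.show()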

Examples of Lab

TODO Perception, Perspectives, and Movement as design material

CLOCK: [2018-03-22 Thu 09:00]–[2018-03-22 Thu 11:20] => 2:20

https://www.moodle.aau.dk/course/view.php?id=25152#section-6

  • How can the different perspectives aid and affect the design process and outcomes?

  • How do our bodies affect perception and action?

  • What are the similarities between movement and any other material in designing interactive systems?

  • How can the developers/designers develop and use their bodily skills?

PREPARATION: READING

Lecture Material

/GP43NC/smc8-courses-embodied-interaction/src/branch/master/MEDIALOGY/screenshot_2018-03-22_16-33-41.png

Kung Fu Motion Visualization on Vimeo by Tobias Gremmler
Fabric weaved by Time
Velocity transforms into Matter
Body transforms matter
Form follows time

DISCUSSION on Miniprojects and Perspectives Page

DISCUSSION EXERCISE

Select one of the visualizations of motion capture data for Kung Fu, dance, and Music Conducting.

Suppose that the visualizations/sounds/feedback were interactive in real time.

  • How will the programming of the visualization promote a certain quality of movement for the mover? That is, what type of movements would you expect users to do with this type of feedback/interaction?

    Kung Fu Motion Visualization on Vimeo 🔊

    • Variation 2:

    • Variation 3.1: Expanding into emptiness

Kung-fu:
  • body: circular movements within own kinesphere; flexible
  • sword

  • Suppose you want the quality of interaction to be drastically different (for instance, slow and smooth instead of fast and jerky)? How could the parameters of the visualizations/sounds/feedback be changed to encourage this type of movement instead?

TODO Reflection on movement exercises

CLOCK: [2018-03-22 Thu 16:21]–[2018-03-22 Thu 16:46] => 0:25

Please fill in a brief reflection on how doing the movement exercises felt and what you learned through them.

The exercises were:

  • Body scan (sitting, feet to head) @ class, paying attention to the position, direction, and contact of the body.

  • Rolling heel-to-toe, falling into a step (balance, contact with the floor, and which foot we step with – left).

  • Walking with different movement qualities (honey/oil; being a stick person; being a glass person; being a rubber person) (one dimension: viscosity; another: quantity of movement; plus scale).

  • Leader-follower (one person with eyes closed following the lead of another).

  • Changing perspectives (standing on desk, sitting under it)

TODO How of EI: Movement Descriptors (computational)

Sensors

Motion Capture (MoCap)

Data

Trajectories of the markers (incl. virtual) are often mapped to a virtual skeleton (sketched after this list):

  • defined by a hierarchy of joints and angle rotations

  • ensures that the body limbs have fixed lengths.
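A minimal Python sketch of that idea: each joint stores a fixed offset from its parent (so limb lengths never change) plus a joint rotation, and world positions follow by walking the hierarchy. The 2D simplification and all names are illustrative, not a mocap-file API.

  # Joint hierarchy with fixed offsets (limb lengths) and per-joint angles;
  # world positions come from accumulating rotations down the tree (2D).
  from dataclasses import dataclass, field
  import math

  @dataclass
  class Joint:
      name: str
      offset: tuple                  # fixed (x, y) offset in the parent frame
      angle: float = 0.0             # joint rotation, radians
      children: list = field(default_factory=list)

  def world_positions(joint, origin=(0.0, 0.0), acc=0.0, out=None):
      out = {} if out is None else out
      ox, oy = joint.offset          # rotate the fixed offset by the
      x = origin[0] + ox * math.cos(acc) - oy * math.sin(acc)   # accumulated
      y = origin[1] + ox * math.sin(acc) + oy * math.cos(acc)   # parent angle
      out[joint.name] = (x, y)
      for child in joint.children:
          world_positions(child, (x, y), acc + joint.angle, out)
      return out

  arm = Joint("shoulder", (0, 0), math.radians(30),
              [Joint("elbow", (0.3, 0), math.radians(45),
                     [Joint("wrist", (0.25, 0))])])
  print(world_positions(arm))  # bone lengths stay 0.3 and 0.25 for any angles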

Mocap data are provided in different formats. e.g.,

  • C3D (3D marker positions)

  • Acclaim,

  • Bio-vision (BVH), and

  • Vicon (both the skeleton and the motion data);

  • text (comma or space delimited); and

  • more general 3D asset formats such as COLLADA and FBX.

/GP43NC/smc8-courses-embodied-interaction/src/branch/master/02_files/BVH_skeletal_structure.jpg

http://effect.motionbank.org/
https://github.com/motionbank/effect-data-player
https://github.com/motionbank/effect-player-examples
https://medium.com/motion-bank/choreographic-coding-effect-b0ab5501c069

Preprocessing

Feature Extraction

https://github.com/SMC7-2019/ASDF-RNN

Processing

https://github.com/SMC7-2019/ASDF-RNN

Display

https://github.com/SMC7-2019/ASDF-RNN

Training Somaesthetic Skills

Change and interest

Disrupting the habitual – estrangement

Another method frequently mentioned by experts in soma-based design is to slow down or disrupt a habitual movement

  • to be able to discern small changes,

  • to note how the movements relate to your emotional experiences,

  • to enjoy or feel pain, and

  • to be engaged cite:Bell:2005ut, cite:Wilde:2017chi, Schiphorst 2007.

DONE MoCap: Data Serialization

CLOSED: [2020-03-24 Tue 17:44] DEADLINE: <2018-03-14 Wed>

CLOCK: [2018-03-03 Sat 10:05]–[2018-03-03 Sat 10:07] => 0:02

Last update: <2016-10-19 Wed> Last visit: <2018-03-03 Sat>

https://www.moodle.aau.dk/course/view.php?id=25152#section-4

We will look at examples of MoCap data and tools to analyze them.

Then we divide into groups to test out

  • the Motion Capture system in Multisensory Experience lab

  • Rokoko Smart Suit (you can download Smartsuit Studio)

We will do some data collection and a practical exercise of getting in (and analysing) movement data.

| TOOL             | Formats    | DOC    |
| MOKKA            | C3D, other | BIOMEC |
| ofxMotionMachine |            |        |

Preparation / Lecture Material / Assignment Folder

EXERCISES with materials from earlier years Folder

TODO FirstPersonReflection of Mocap   DATA

SCHEDULED: <2020-01-06 Mon>

CLOCK: [2018-08-28 Tue 15:18]–[2018-08-28 Tue 15:43] => 0:25

| Movement                       | Student    |
| Explode                        | Bogdan     |
| Sun salutation & bird position | Anna       |
| Circle kick                    | Niclas     |
| Zombie                         | Mathias RT |
| Pulling a rope                 | Patrick    |
| Jumping                        | Camilla    |
| Ballerina spin                 | Aishah     |
| Karate Movement                | Mathias MC |
| Hand stand arm spin            | Laurynas   |
| Circles                        | Francesco (see below) |
| Worm Movement                  | Andreas    |
| ???                            | Mads       |

Circles | Francesco

Tracing circles in the air with my arms, slowly, inwards and outwards. Laban effort factors: indirect, light, sustained, bound. Trying to convey a feeling of calmness and balance.

The Rokoko Suit allowed me to explore the space and rethink its boundaries myself: first I had thought of a still position, but then I decided to start walking into the room.

On the other hand, I felt constrained and a bit clumsy from wearing the suit. This led me to perform a heavier movement than I thought.

WAITING TASKS FOR MOCAP

CLOCK: [2018-03-03 Sat 10:07]–[2018-03-03 Sat 10:11] => 0:04

Think of a movement to do {{{perform?}}}

Think of the QUALITY (alternative feeling) you want the movement to have and express. (In the optical MoCap, you will have markers on head, hands, and perhaps legs.)

Write this down on the post-it.

Once you have done the movement, you will be asked to describe how it FELT doing the movement.

We will note your intentions and descriptions of movements done, and analyse.

For instance using: http://biomechanical-toolkit.github.io/docs/Mokka/index.html https://github.com/Biomechanical-ToolKit/Mokka

We are also looking at this repo, as described in [1], but it doesn't look production-ready: https://github.com/numediart/ofxMotionMachine

OBSOLETE Install MoCap toolbox, read manual, install

CLOSED: [2017-03-09 Thu 09:32]

CLOCK: [2017-03-07 Tue 10:00]–[2017-03-07 Tue 11:30] => 1:30
CLOCK: [2017-03-07 Tue 13:45]–[2017-03-07 Tue 16:30] => 2:45

Data: https://www.c3d.org

Installed Max 7, the sadam library, and CNMAT tools; also needed Lobjects for http://www.uio.no/english/research/groups/fourms/downloads/software/mcrtanimate/index.html. Downloaded Java from https://support.apple.com/kb/DL1572?locale=en_US to make the Max 5 standalone work.

Installed Mokka

OBSOLETE MATLAB

CLOSED: [2018-03-03 Sat 09:56]

CLOCK: [2017-03-09 Thu 09:39]–[2017-03-09 Thu 21:05] => 11:26

mcread

TODO Games, AI, and Embodiment   ATTACH

CLOCK: [2018-04-04 Wed 12:27]–[2018-04-04 Wed 12:34] => 0:07

Wwizard THINQ link to CD attachment: https://goo.gl/RtDqpf

TODO Isbister video and movementbasedgameguidelines.org

CLOCK: [2018-04-03 Tue 10:46]–[2018-04-03 Tue 11:11] => 0:25

That video is too emotional. This CHI16 one is better for guidelines: https://youtu.be/opUK79BJvJI Exploit RISK.

Games: MDA, Brave NUI World, Isbister, body-centric computing

SimpleAI

CLOCK: [2018-04-03 Tue 14:03]–[2018-04-03 Tue 14:28] => 0:25

The game was developed in the game engine Unity 5 with three scripts (the decomposition is sketched below):

  1. Player: creating randomly generated obstacles and agents.

  2. EnemyAI: consisting of the behaviour of the agents.

  3. Output: for storing the in-game data.
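The actual scripts are Unity C#; purely to illustrate the decomposition, here is a hypothetical Python sketch with the same three responsibilities (all names illustrative):

  # Hypothetical sketch of the Player / EnemyAI / Output split (the course
  # game itself is Unity 5 C#; nothing here is the course code).
  import csv, random

  class Player:
      """Creates randomly generated obstacles and agents."""
      def spawn(self, n):
          return [[random.uniform(0, 10), random.uniform(0, 10)] for _ in range(n)]

  class EnemyAI:
      """Behaviour of the agents: drift toward the player each tick."""
      def step(self, agent, player_pos, speed=0.1):
          agent[0] += speed * (player_pos[0] - agent[0])
          agent[1] += speed * (player_pos[1] - agent[1])

  class Output:
      """Stores the in-game data for later analysis."""
      def __init__(self, path):
          self.path, self.rows = path, []
      def log(self, tick, agents):
          self.rows += [[tick, i, x, y] for i, (x, y) in enumerate(agents)]
      def save(self):
          with open(self.path, "w", newline="") as f:
              csv.writer(f).writerows(self.rows)

  player, ai, out = Player(), EnemyAI(), Output("game_log.csv")
  agents = player.spawn(5)
  for tick in range(100):
      for a in agents:
          ai.step(a, (5.0, 5.0))
      out.log(tick, agents)
  out.save()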

TODO REFLECTIONS

CLOCK: [2018-04-06 Fri 08:28]–[2018-04-06 Fri 08:53] => 0:25

Kinaesthetic empathy: a human skill?

DONE Embodiment and VR [1/1]   ATTACH VR

CLOSED: [2019-04-09 Tue 08:20] SCHEDULED: <2019-03-14 Thu 09:00-12:00> DEADLINE: <2019-03-10 Sun>

CLOCK: [2019-03-14 Thu 11:31]–[2019-03-14 Thu 11:44] => 0:13
CLOCK: [2018-03-22 Thu 05:55]–[2018-03-22 Thu 07:23] => 1:28
CLOCK: [2018-03-22 Thu 05:42]–[2018-03-22 Thu 05:46] => 0:04
CLOCK: [2018-03-21 Wed 08:19]–[2018-03-21 Wed 08:56] => 0:37
CLOCK: [2018-03-11 Sun 09:01]–[2018-03-11 Sun 09:45] => 0:44

15.3 AM Cumhur

Learning outcomes: after this session, you will be able to

  • understand the three illusions in VR

    • place, plausibility, embodiment

  • understand the bodily skills needed for technological development, decision making, steering, and path finding in VR

  • apply methods and techniques to real world scenarios (VR) and project concepts

  • analyze, compare, and assess the potential of different methods and techniques in order to make the proper design choices in VR

Please check out the links in the MATERIAL before the lecture, and start thinking how to integrate these elements in your mini-project design.

DONE READ New REFS on VR & Performance

CLOSED: [2019-09-11 Wed 08:45]

CLOCK: [2019-03-05 Tue 20:25]–[2019-03-05 Tue 20:50] => 0:25

  • cite:Smith2018_IJPADM:

  • cite:Dixon2006_IJPADM

  • deLahunta, Scott. 2002. Virtual Reality and Performance

  • cite:Gillies:2019:TOCHI distinguishes three interaction strategies:

    1. object-focused

    2. direct-mapping (VR)

    3. movement-focused

  • \cite{Spanlang:2014fe}: How to build an embodiment lab

    • Embodiment, under certain conditions, can induce body ownership and agency

Model-based interaction refs:

\cite{oulasvirta2019}: Oulasvirta, Antti. “It's Time to Rediscover HCI Models.” Interactions 26 (2019): 52–56. doi:10.1145/3330340.

Preparation / Lecture Material / Reflections

DONE Preparation

CLOSED: [2018-05-09 Wed 15:32] SCHEDULED: <2018-03-28 Wed>

  • Read the VR-book Chapter 4: Immersion, Presence, and Reality Trade-offs

  • Consider the applications of the design guidelines to your mini-project (Section 5.4), and bring your ideas to the class.

  • Watch the interaction technologies & design examples part of the SIGGRAPH 2017 VR interaction course https://youtu.be/RNypfiiyI8A?t=2h4m10s (only the 3rd speaker)

Lecture Material

CLOCK: [2018-03-21 Wed 09:57]–[2018-03-21 Wed 10:22] => 0:25

VR-book Chapters 4,5, 28, 29.

https://www.gdcvault.com/play/1023649/Human-Centered-Design-of-Immersive

Reflections

What is

  • interaction fidelity?

  • GoGo Technique?

Stefania Serafin, Niels Christian Nilsson, Cumhur Erkut, and R Nordahl. 2016. Virtual reality and the senses. Danish Sound Innovation Network. Retrieved from https://issuu.com/danishsound/docs/dtu_whitepaper_2017_singlepages

  • Read VR book Section 4
  • Answer VR book guidelines in Section 5 in relation to the miniproject

Assignment @   ATTACH

CLOCK: [2018-03-22 Thu 07:23]–[2018-03-22 Thu 07:48] => 0:25

https://www.moodle.aau.dk/mod/assign/view.php?id=749778

Select an interaction pattern from Ch28 (in lecture material) relevant to your mini-project. Find more examples (videos, projects, code, etc), and the associated guidelines in Ch29, if any.

Describe the interaction pattern as a first-person experience, emphasizing the differences of felt experiences in real and virtual worlds, and if possible in Laban dimensions.

Move to a related pattern until you describe at least three patterns.

Submit your descriptions as pdf.

/GP43NC/smc8-courses-embodied-interaction/src/branch/master/MEDIALOGY/screenshot_2018-03-22_07-23-10.png

DONE Bjørn A. Hansen, Mathias R. Thomsen, Niels Valentin

CLOSED: [2018-03-22 Thu 07:34]

CLOCK: [2018-03-21 Wed 08:56]–[2018-03-21 Wed 09:41] => 0:45

A good investigation of interaction patterns, with application in mind

  • (direct hand) manipulation patterns

    • subtle / simple

    • GM won't look behind: not too much precision or fine control

  • non-spatial control patterns: gestures, voice communication for storytelling

  • indirect control patterns: users (GM) won't be watching the effects while they are “performing”

First-person perspective is good. Laban dimensions are missing (but maybe they could be added when you specify gestures?).

Interesting mini-project and references. You're interested in EI + Storytelling, so check this out: http://voicesofvr.com/629-embodied-storytelling-innovations-from-sundance-2018-with-shari-frilot/

TODO EI and Philosophy: How tight should the connection be?   slide

SCHEDULED: <2018-02-19 Mon 09:00 - 12:00>

CLOCK: [2020-03-24 Tue 14:34]–[2020-03-24 Tue 17:38] => 3:04
CLOCK: [2018-02-19 Mon 08:27]–[2018-02-19 Mon 12:27] => 4:00
CLOCK: [2018-02-07 Wed 14:23]–[2018-02-07 Wed 14:48] => 0:25
CLOCK: [2017-02-23 Thu 08:32]–[2017-02-23 Thu 08:57] => 0:25

Debate Instructions: https://www.youtube.com/watch?v=EWfMV_jbiOU
Example, Berkeley vs Harvard: https://www.youtube.com/watch?v=JhzwSlK4uEc
Dartmouth practice: https://www.youtube.com/watch?v=LMO27PAHjrY

Phenomenology

Pragmatism

Somaesthetics

COMMENT DONE Debate with argumentation

CLOSED: [2018-03-28 Wed 20:01]

In this debate, two opposing groups are formed, based on their stance in respect to

  1. Kristina Höök et al. Embracing First-Person Perspectives in Soma-Based Design. Informatics 2018, 5(1), 8; doi:10.3390/informatics5010008 (main literature)

  2. Antti Oulasvirta and Kasper Hornbæk. 2016. HCI Research as Problem-Solving. ACM Press, 4956–4967. http://doi.org/10.1145/2858036.2858283

The groups will present their own views on the chosen theme (philosophical background) and give counter-arguments to the opposing views (e.g., philosophy of science vs. somaesthetics). The students practice presenting justifications and arguments for their own opinions and evaluating other people's opinions. They start with the arguments given in the two papers, applied to their own (previous) designs that they want to improve or extend through Embodied Interaction. The goal is not to beat the opponent, but to further one's own understanding.

A chairperson is chosen for the debate, who ensures everyone has a chance to talk. Cumhur and Sofia will help the chairperson, who will also make sure that the arguments do not last too long. The chairperson lets the opposing sides present their arguments in turns. If the debate does not progress, the chairperson may also give the teachers a chance to present a stimulating argument to further the debate. If it succeeds, the debate will force the participants to analyse their opinions.

Other philosophical topics
  • Phenomenology from Svanæs

  • Embodied cognition from Kirsh

Other problem solving topics

TODO Miniprojects

CLOCK: [2020-03-20 Fri 09:24]–[2020-03-20 Fri 09:24] => 0:00
CLOCK: [2018-02-17 Sat 17:49]–[2018-02-17 Sat 18:14] => 0:25
CLOCK: [2018-02-17 Sat 08:16]–[2018-02-17 Sat 08:27] => 0:11

DONE Final Event with Skeleton Conductor + MuX

CLOSED: [2018-02-02 Fri 16:54]

CLOCK: [2018-02-02 Fri 12:50]–[2018-02-02 Fri 13:54] => 1:04
CLOCK: [2018-02-02 Fri 16:50]–[2018-02-02 Fri 16:54] => 0:04

HTC VIVE opens a stage with drums, keyboard, and a rhythm sequencer. Instruments have I/O controls and spatial sound – the main menu has different options for rendering, including mono. The app runs on Windows and mirrors to the desktop. Opening the Unity binary with left-shift pressed brings out input choices, but I could not map the HTC VIVE controls to keyboard/Unity controls. Hence, the app will start up without an HTC VIVE, but could not be interacted with.

We could try out VRTK with the source code, or look into forks to see if anybody did that before. We could also check how to add new instruments, and learn how to jam with it. The headphone cable to the VIVE is too short. The MedifaceUSB drivers of the WFS64 are not running on the VIVE computers.
