YanJun Lyu

A portfolio of visual, research, and digital design work.

"Any sufficiently advanced technology is indistinguishable from magic." - Arthur C. Clarke

HumInterFace 2016

The title, HumInterFace, combines "Human" and "Interface." It was a team project conducted in the Tangible Interfaces class at MIT (2016).

Team Members: Tanuja Mishra, Jeremy Finch, Daniel Levine, Yanjun Lyu.

We are building a pipeline that lets multiple users move together to control a Sphero robot. We propose a new approach to remote physical communication, based on shared workspaces, human movement, and object manipulation. The current prototype manipulates a physical ball, Sphero (an app-controlled robotic ball driven wirelessly over Bluetooth). By attaching a universal sensor (a phone) to the body and connecting the phone over Wi-Fi to a server, users can steer the ball through body movement. Each user changes the ball's movement (speed and direction) by tilting their body, and each part of the body, the head, arms, and legs, controls the object in its own way: move your head to the left and the ball rolls left; raise your left arm and the ball jumps. Multiple users together unlock the ball's full functionality.
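The body-tilt-to-command mapping can be sketched roughly as follows. This is a simplified Python illustration, not the project's actual code (the apps were written in Swift), and the tilt thresholds and speed scaling here are invented for the example:

```python
# Hypothetical sketch of the body-tilt -> Sphero command mapping.
# Tilt angles would come from the accelerometer of a phone worn on the body.

def tilt_to_command(roll_deg, pitch_deg):
    """Map body tilt (degrees) to a heading change and drive speed.

    roll_deg:  left/right lean (negative = left)
    pitch_deg: forward/backward lean (positive = forward)
    Thresholds are illustrative, not the values used in the project.
    """
    if pitch_deg < -45:                       # sharp backward lean: 180-degree spin
        return {"heading_change": 180, "speed": 0}
    heading_change = 0
    if roll_deg < -15:                        # lean left -> turn left
        heading_change = -30
    elif roll_deg > 15:                       # lean right -> turn right
        heading_change = 30
    # Forward lean past a small dead zone drives the ball, scaled to [0, 1].
    speed = round(max(0.0, min(1.0, pitch_deg / 45)), 2) if pitch_deg > 10 else 0.0
    return {"heading_change": heading_change, "speed": speed}
```

In the real pipeline these commands would be pushed to the server over the websocket rather than returned directly.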

Our group has four members, and each of us can change a different aspect of the ball's behavior, including its rotation, color, and bounce height.

The pipeline has three parts:

1. A separate phone or tablet acts as the receiver for the websocket, converting acceleration data into motion. The app on this device uses the Sphero API to send commands to the Sphero robot. Left and right body tilts translate into left and right turns for the Sphero; a forward lean corresponds to driving forward; a sharp backward lean reverses Sphero's heading (a 180-degree spin); and each person's movement changes Sphero's color.

2. The server provides a websocket connection that all clients flow through. It behaves like a chatroom for commands: a pool of everyone's data. The receiver app records everything that flows through this "chatroom" and pulls out each person's commands by identifier.

3. The apps are written in Swift using Xcode for iOS, and the web server is written in Elixir using the Phoenix Framework, hosted on DigitalOcean. The apps use the Sphero API and the SwiftPhoenixClient library to relay data to the server.
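The "chatroom for commands" in part 2 can be sketched as a shared pool that the receiver filters by person identifier. This is a hedged Python sketch of the idea only; the real server was Elixir/Phoenix channels, and the person identifiers and command fields below are made up:

```python
# Hypothetical sketch of the command "chatroom": every client broadcasts
# commands into a shared pool, and the receiver pulls them out per person.

class CommandPool:
    def __init__(self):
        self.messages = []            # everything that flows through the room

    def broadcast(self, person_id, command):
        """A client pushes a command into the shared pool."""
        self.messages.append({"person": person_id, "command": command})

    def commands_for(self, person_id):
        """The receiver app pulls out one person's commands, in order."""
        return [m["command"] for m in self.messages if m["person"] == person_id]

# Example traffic from two (hypothetical) users.
pool = CommandPool()
pool.broadcast("user_a", {"heading_change": -30, "speed": 0.5})
pool.broadcast("user_b", {"color": "blue"})
pool.broadcast("user_a", {"heading_change": 0, "speed": 0.8})
```

The receiver would then translate each person's stream into Sphero API calls, so each user controls their own slice of the ball's behavior.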

HumInterface
User Testing

SleepScape 2016

SleepScape monitors your sleep states and learns your habits, adjusting to your sleep routine over time.

MIT, Tangible Interfaces class, 2016.

Team Members: Yanjun Lyu, Joey (Wei) Xu, HongLiang Wang, Kristin Osiecki.


Communicational Plant 2016

Physical Tree

A real tree serves as a communication medium, helping people who are far apart build an emotional connection. A metaphor is "a direct comparison between two unrelated or indirectly linked things." The object's function is to embody imagination and subtle emotion, helping people communicate when verbal expression falls short. The metaphorical object also grows out of shared memories; thus, the plant is a metaphorical object between my grandfather and me.

I designed three user scenarios. In the first, my grandfather touches the soil with his hand to feel whether it is dry and how much water it holds; these are physical measurements. When he waters his plant, a corresponding signal light turns on at my location, built with a NeoPixel digital RGB LED strip. Different colors convey different meanings in different scenarios.

The second scenario is a warning message. If one of us neglects the plant for a long time, about two weeks for a lily, we both receive a warning message and a vibration signal from the plant. For example, if the color sensor detects that my grandfather's leaves are unhealthy, it sends data to my phone with the message "CALL HIM RIGHT NOW" and makes my plant vibrate. Similarly, if the color sensor picks up the same from my plant, it sends my grandfather's phone the message "What's going on?"

The third scenario is blooming. We both keep the same plant, a lily, which blooms only from April to May. The prototype is camera-based: it mirrors my flowers onto my grandfather's plant, and likewise, when my grandfather's plant blooms, I can see the flower at my location through the camera.

Some pictures show the research process, such as reading physical data with a moisture sensor and using the color sensor to judge whether the leaves are healthy.
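The first two scenarios boil down to a small rule set: soil moisture drives the signal light, and neglect or unhealthy leaves trigger the warning. The Python sketch below illustrates that logic under assumed thresholds; the actual prototype ran on Arduino hardware with real sensor readings, and these cutoff values are invented for the example:

```python
# Hypothetical sketch of the plant's signalling rules.
NEGLECT_DAYS = 14                     # about two weeks for a lily

def led_color(moisture_pct):
    """Scenario 1: map soil moisture to a NeoPixel signal color.

    Thresholds are illustrative, not calibrated sensor values.
    """
    if moisture_pct < 20:
        return "red"                  # dry: the plant needs water
    if moisture_pct < 60:
        return "yellow"               # getting dry
    return "green"                    # recently watered

def check_neglect(days_since_care, leaves_healthy):
    """Scenario 2: trigger the warning message and vibration."""
    if days_since_care >= NEGLECT_DAYS or not leaves_healthy:
        return {"message": "CALL HIM RIGHT NOW", "vibrate": True}
    return {"message": None, "vibrate": False}
```

On the hardware, the return values would drive the LED strip and the vibration motor and push a notification to the other person's phone.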

Camera capturing the color of the leaves

Haptic Dialogue 2016

A wearable glove:

The human hand, as well as being a principal vehicle of motor activity, is the main organ of the fifth sense, touch. Touch is fundamental to physical and emotional development. Through touch, the hand receives and manipulates information from the everyday objects of the world, which in turn offer feedback and response as people connect with them. Thus, I created a glove prototype that is driven by a few simple gestures and serves as a communication language for people who are far apart. People receive signals through different kinds of biofeedback: rising temperature, vibration, squeezing, and a visual LED light.

User scenario:

The glove, as a communication medium, transmits "hand temperature" from one person to another. It is devoted to helping people directly sense the warmth of their family, as if they were still by your side. Blowing a kiss is a pre-programmed gesture that sends the signal "I miss you" to a friend who lives in another city; the friend receives it as the glove warms up, as though they could directly touch their close friend's hand. Second, a good-bye wave means "let's stop our dialogue," which the receiver understands through the glove's vibration. Third, the glove can record and play back messages the receiver missed: when the receiver puts the glove on, they can feel those messages through the glove.


Metaphorical Communication - Virtual Plant 2016

User scenario

The virtual plant grows through software written in Processing. People use three simple hand gestures, the ones you would normally use to take care of a plant in daily life, to shape the plant's growth, making it bloom or wither away. I gave up the Leap Motion controller that I used for the previous project, Virtual Object, in favor of physical sensors: a bending sensor, a tap sensor, and a temperature sensor. These sensors connect to an Arduino, which relays their readings to Max on the computer.

The difference between the virtual object and the virtual plant is that the virtual plant is endowed with a metaphorical meaning for my grandfather and me: my grandfather placed his love in the plant, hoping I would grow as well as it does. Second, compared with the virtual object, the virtual plant asks users to combine three different gestures to play with it rather than a single gesture.
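The caretaking loop, gestures make the plant grow toward blooming, neglect makes it wither, can be sketched as a tiny state machine. This is a hedged Python sketch of the behavior only; the original ran in Processing and Max, and the gesture names below ("stroke", "tap", "warm", standing in for the bend, tap, and temperature sensors) are assumptions:

```python
# Hypothetical sketch of the virtual plant's growth states.

class VirtualPlant:
    def __init__(self):
        self.care_points = 0          # accumulated caretaking

    def gesture(self, name):
        """A caretaking gesture (assumed names) adds one point of care."""
        if name in ("stroke", "tap", "warm"):
            self.care_points += 1
        return self.state()

    def neglect_tick(self):
        """Time passing without care drains the plant."""
        self.care_points = max(0, self.care_points - 1)
        return self.state()

    def state(self):
        if self.care_points == 0:
            return "withering"
        if self.care_points >= 3:     # all three gestures together -> bloom
            return "blooming"
        return "growing"
```

In the installation, each state would select a different rendering of the plant (growing, blooming, or withering) in the Processing sketch.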

Growing with hand care
Withering without hand manipulation
Blooming when taking care of it together

Virtual Object 2015

Stage 1: Virtual Object

"The direct comparison between two unrelated or indirectly linked things is called a metaphor. They can add impact or can help you explain a difficult concept by association with a more familiar one." 

The virtual object is a metaphorical expression that conveys emotional signals through physical design attributes such as form, color, or material. In other words, we can readily perceive information and emotive expression through the form of a virtual object manipulated by hand gestures.

This case concentrates on changing the form of the virtual object. As a former graphic designer, I designed various modes of object expression, each driven by a different emotional gesture. The object's variable forms carry different meanings, making it easy for users to talk with one another through it: forms for excitement, love, sadness, and so on.

The object is meant to serve people at a distance, or anyone for whom verbal communication is not an easy way to interact or express subtle emotion to another person.

Object communication is, therefore, the method I use to build an emotional bridge between individuals who live great distances from each other and to enhance their connection.

Tools: Max, Leap Motion controller, MAYA, 3D printer.

Video Demo
Controlling the sound with LeapMotion

A first step in exploring the communicational object: using the Leap Motion controller to manipulate sound.


Motional ATM 2015

The Mo-ATM project is about human-machine interaction: users operate the machine through body language and facial expressions. Each body movement carries a function message that the ATM recognizes. For example, each button shows a dynamic motion picture, and users mimic the motion on a button to withdraw cash, deposit cash, and so on. In addition, each facial expression carries a sentiment, and a sequence of expressions forms the user's ATM password (PIN): users perform the series of facial expressions to the ATM's camera to enter their personal interface.

Keywords: nonverbal communication (body movement, facial expressions, eye contact, posture); human-machine interaction
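The expression-based PIN amounts to matching an ordered sequence of classified expressions against the enrolled one. The Python sketch below illustrates that check; the expression labels and the fixed-length matching rule are assumptions for the example, not the project's actual recognizer:

```python
# Hypothetical sketch of the facial-expression PIN check.
# A real system would classify expressions from the ATM camera;
# here they arrive as pre-classified labels such as "smile" or "frown".

def verify_expression_pin(observed, enrolled):
    """The PIN matches only if the expressions appear in the right order."""
    return list(observed) == list(enrolled)

class ExpressionReader:
    """Collects expressions until the PIN length is reached, then checks."""

    def __init__(self, enrolled):
        self.enrolled = list(enrolled)
        self.buffer = []

    def see(self, expression):
        """Feed one classified expression; returns True/False when complete."""
        self.buffer.append(expression)
        if len(self.buffer) < len(self.enrolled):
            return None               # still collecting
        ok = verify_expression_pin(self.buffer, self.enrolled)
        self.buffer = []              # reset for the next attempt
        return ok
```

On a match, the ATM would open the user's personal interface; on a mismatch, it would restart the sequence.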

Non-verbal communication
Dynamic interface
The first prototype
The second prototype
Sketches of the user scenarios

You Are Here 2014

The project is called "You Are Here." 2014 was the first year I lived in Boston, so I recorded and compared the differences between Beijing and Boston, such as buildings, food, language, menus, etc.
YOU ARE HERE

Interactive Game 2013

Sound-controlled Game
Design and draw the background color