Wednesday, September 15, 2004

yoichiro xinwei chat re flow, visual effects 040910

Date: September 10, 2004 9:45:08 AM EDT (lightly edited by sxw)
Xin Wei + Yoichiro Serita
...
Xin Wei Sha: good to see you -- what time is it?
Yoichiro Serita: it's.. 10:15pm @ tokyo
Xin Wei Sha: i hope you're doing okay with work, and am looking forward to seeing you in rotterdam!
Yoichiro Serita: yes, me too!
Xin Wei Sha: delphine has made a very nice extension of the fluid jitter external so that it can now take optical flow as a vector field
Yoichiro Serita: wow, that is great news. I'm really looking forward to seeing that!
Xin Wei Sha: i encouraged her to communicate with you and give you the external when it's debugged and neater -- maria cordell is working on the max interface. that way you can use it too
9:20 AM
Yoichiro Serita: coool! do you have some captured images?
Xin Wei Sha: mm not me, but she and maria do -- maybe you can email them directly for some qt video
Yoichiro Serita: sure, i'll do that-
Xin Wei Sha: and joel would even like to use your jitter results as input for his sound synthesis
Xin Wei Sha: could be very interesting. so then we can think more about
Xin Wei Sha: the visual response to people's movement.
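[A rough sketch of the idea, for reference only -- this is not Delphine's Jitter external, just a Python/OpenCV approximation of feeding a dense optical-flow field from the camera into a fluid-like visual layer. The camera index, the Farneback flow parameters, and the simple advect-and-decay "fluid" stand-in are all assumptions.]

```python
# Not the actual Jitter external -- a rough approximation: a dense optical-flow
# field from the camera drives a placeholder fluid-like layer.
import cv2
import numpy as np

cap = cv2.VideoCapture(0)                       # camera index is an assumption
ok, prev = cap.read()
prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
dye = np.zeros(prev_gray.shape, np.float32)     # stand-in for the fluid layer

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # dense optical flow: a per-pixel (dx, dy) vector field
    flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                        0.5, 3, 15, 3, 5, 1.2, 0)

    # "fluid" placeholder: advect the dye field along the flow vectors,
    # inject dye where there is movement, and let it decay slowly
    h, w = gray.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(np.float32)
    dye = cv2.remap(dye, xs - flow[..., 0], ys - flow[..., 1], cv2.INTER_LINEAR)
    dye += np.linalg.norm(flow, axis=2) * 0.05
    dye *= 0.99

    cv2.imshow("flow-driven layer", np.clip(dye, 0, 1))
    prev_gray = gray
    if cv2.waitKey(1) == 27:                    # ESC to quit
        break
```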
...
Xin Wei Sha: excellent -- whatever is inspiring for you! the one extra comment is that if we light the space with strong downward-pointing lights, we could isolate certain areas in "cones" of illumination that the cameras would pick up
Xin Wei Sha: i.e. people who happen to be in those cones of light would have their motions picked up more by our system,
Yoichiro Serita: hahaha interesting!
Xin Wei Sha: so that's something to think about -- it would make people more sensitive to their surrounding space, rather than just staring at the membrane screen -- which could be very good, what do you think?
Yoichiro Serita: yes, I agree! this time, my primary concern is "flow", so..
Yoichiro Serita: i want to use membrane for people to be more sensitive to the environment
Yoichiro Serita: the idea is still abstract, though..
Yoichiro Serita: but anyway
Yoichiro Serita: i don't want people to "stop and see" membranes, too much
Yoichiro Serita: this time
Xin Wei Sha: yes we agree on this
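[A small sketch of the "cones of light" idea above: only motion that happens inside the brightly lit pools gets passed on to the visuals. The luminance threshold and the simple frame-difference motion measure are assumptions, not what the installation actually uses.]

```python
# Sketch of the "cones of light" idea: only motion inside brightly lit pools
# is passed on to the visuals. Threshold and the frame-difference motion
# measure are assumptions, not the actual patch.
import cv2
import numpy as np

LIGHT_THRESHOLD = 200          # assumed 8-bit luminance level for "inside a cone"

def motion_in_cones(prev_gray, gray):
    """Per-pixel motion (0..1), zeroed outside the lit cones."""
    motion = cv2.absdiff(gray, prev_gray).astype(np.float32) / 255.0
    lit = (gray >= LIGHT_THRESHOLD).astype(np.float32)
    return motion * lit
```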
Yoichiro Serita: ok, thanks! any ideas you've already come up with would be a great stimulation for me- so please email me when you have time-
Yoichiro Serita: also,
Yoichiro Serita: well... by the way
Yoichiro Serita: where should i go on the 17th when i arrive in rotterdam?
Xin Wei Sha: we all should agree on that location. it could be v2 office which is easy to find in rotterdam. but chris will work that out with v2 for us.
Xin Wei Sha: questions re flow:

how the visual effects should change over time, and in response to, for example, the varying density of bodies and the rate of flux of bodies past the cameras.
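[A sketch of how those two measures might be computed from a dense optical-flow field -- "density" as the fraction of the frame that is moving, "flux" as the mean flow across a virtual line in front of a camera -- plus a slow parameter follower so the visuals evolve gradually over time. The thresholds and smoothing rate are assumptions.]

```python
# Hypothetical measures derived from a dense optical-flow field `flow` (H x W x 2).
import numpy as np

def density_and_flux(flow, line_x=None, move_thresh=0.5):
    mag = np.linalg.norm(flow, axis=2)
    density = float((mag > move_thresh).mean())          # 0..1, how much of the frame moves
    if line_x is None:
        line_x = flow.shape[1] // 2                      # vertical line at mid-frame
    flux = float(np.abs(flow[:, line_x, 0]).mean())      # horizontal motion through the line
    return density, flux

class SlowParameter:
    """Lets an effect parameter follow density/flux gradually, so visuals change over time."""
    def __init__(self, rate=0.02):
        self.value, self.rate = 0.0, rate
    def update(self, target):
        self.value += self.rate * (target - self.value)  # exponential smoothing
        return self.value
```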

and also how the 3 screens could echo, or relate to each other.
it would be interesting if some "antiphonal" or echo relations between the membranes could do what you like -- make people sensitive to the spaces in between rather than to the screens themselves... the simplest could be a time delay of the effect, which could vary.
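[A sketch of that antiphonal/echo relation with a varying time delay: each membrane replays its neighbour's activity field after a delay that slowly drifts. The frame rate, delay range, and blending weights are assumptions.]

```python
# Each membrane blends its live field with a delayed echo of its neighbour.
from collections import deque
import numpy as np

FPS = 30
MAX_DELAY_S = 8.0

class EchoScreen:
    def __init__(self, shape):
        self.shape = shape
        self.history = deque(maxlen=int(MAX_DELAY_S * FPS))

    def push(self, field):
        self.history.append(field.copy())

    def delayed(self, delay_s):
        """Field from roughly delay_s seconds ago (zeros until enough history exists)."""
        n = int(delay_s * FPS)
        if n == 0 or len(self.history) < n:
            return np.zeros(self.shape, np.float32)
        return self.history[-n]

def compose(screens, live_fields, t):
    """screens[i] shows its own live field blended with a delayed echo of screen i-1."""
    out = []
    for i, live in enumerate(live_fields):
        screens[i].push(live)
        delay = 2.0 + 1.5 * np.sin(0.05 * t + i)         # seconds, drifting over time
        echo = screens[i - 1].delayed(delay)              # i-1 wraps around the 3 screens
        out.append(0.7 * live + 0.3 * echo)
    return out
```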

and how to use journaled past activity -- maybe build up a layer that recalls/echoes past activity and becomes more visible when there is less current activity? that way, if people congregated near a location in the past, that location would be registered somehow. but we cannot expect the projected video to register with what a person sees through the physical screen.
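[A sketch of that journaled-activity layer: movement accumulates into a slowly decaying memory image, which is blended in more strongly when current activity is low. The decay and visibility constants are assumptions.]

```python
# Activity memory that shows through when the space is quiet.
import numpy as np

class ActivityJournal:
    def __init__(self, shape, decay=0.999):
        self.memory = np.zeros(shape, np.float32)
        self.decay = decay

    def update(self, motion):
        """motion: per-pixel activity (H x W, 0..1) for the current frame."""
        self.memory = self.memory * self.decay + motion
        return self.memory

    def layer(self, motion):
        """Memory layer to composite, fading out when the space is busy."""
        current = float(motion.mean())
        visibility = 1.0 / (1.0 + 20.0 * current)    # quiet room -> memory shows through
        normalized = self.memory / (self.memory.max() + 1e-6)
        return visibility * normalized
```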
Xin Wei Sha: just for the record:
Xin Wei Sha:
Yoichiro Serita: i'm reading..
Xin Wei Sha: i'll record this chat now and go to my office (in cambridge) now to write up notes for you, ok?
Yoichiro Serita: great, thanks!
Xin Wei Sha: maybe as you and i get a chance to walk past lots of people this week, or walk past nature -- the sea perhaps! -- this will generate flow ideas, too.
Xin Wei Sha: do ask delphine and maria to send the external and videos when they're ready...
Yoichiro Serita: yes! i agree with the idea!
...
