Ideas
The technologies we are considering centre on two elements of animation, movement and interaction, and on how those elements could be explored and replicated within a physical framework. Given the timeframe, it was felt that the technologies could be developed into a successful proof of concept, with a final showcase technology being unveiled.
It was agreed that further research and development could uncover other properties that would add to the potential portfolio of pervasive techniques for performing physical cartooning.
Delivery concepts for our pervasive techniques…
Digital Portals
The general approach involves placing the user in a particular, controllable situation so that real-world events can be convincingly simulated using digital screens and projections. For example, a microscope restricts the user to one eye, so they would not expect a 3D representation. Brain the Plankton (Creature Comforts) was considered as the World's Smallest Host for a visitor centre, for instance, and the character emerged as a product of this idea because of the simplicity of the animation. Something like a periscope, which confines both eyes to a controlled and contained space, could also simulate three dimensions using a stereoscopic output. This would be ideal for layering cartoon characters into live video feeds.
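As a rough illustration of the layering idea, the sketch below composites a cartoon sprite over a live camera feed. It assumes the OpenCV and NumPy libraries, a hypothetical character.png image with an alpha channel, and it ignores the stereoscopic aspect entirely.

```python
# A minimal sketch of layering a cartoon character over a live feed, assuming
# OpenCV and NumPy; "character.png" is a hypothetical BGRA sprite with alpha.
import cv2
import numpy as np

sprite = cv2.imread("character.png", cv2.IMREAD_UNCHANGED)  # keeps the alpha channel
capture = cv2.VideoCapture(0)                               # default camera

def overlay(frame, sprite, x, y):
    """Alpha-blend the sprite onto the frame at (x, y); assumes it fits in the frame."""
    h, w = sprite.shape[:2]
    roi = frame[y:y + h, x:x + w]
    alpha = sprite[:, :, 3:4] / 255.0                        # per-pixel opacity, 0..1
    roi[:] = (alpha * sprite[:, :, :3] + (1 - alpha) * roi).astype(np.uint8)

while True:
    ok, frame = capture.read()
    if not ok:
        break
    overlay(frame, sprite, x=50, y=50)                       # fixed position for the demo
    cv2.imshow("portal", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):                    # quit on 'q'
        break

capture.release()
cv2.destroyAllWindows()
```

In a periscope version the character would be rendered twice, once per eye, with a small horizontal offset to give the stereoscopic depth.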
Digital Surroundings
Methods of influencing the surroundings were also discussed, using projection/back-projection and display systems hidden in unusual, taken-for-granted spaces such as shop windows, letter boxes and drains. After working through a number of possibilities, a magic mirror was settled upon, giving the impression of a morphing experience that would replace the user's face with that of an animated character, controlled by iris recognition. It was decided to prototype the iris tracking, as it was felt that this type of human-computer interaction offered sufficient scope for development and control.
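A first prototype of the tracking side could be as simple as the sketch below, which assumes OpenCV's bundled Haar cascades for face and eye detection; the actual face-replacement and morphing step is only indicated by a placeholder rectangle.

```python
# A rough sketch of the eye-tracking side of the mirror, assuming OpenCV and its
# bundled Haar cascades; the face replacement itself is only marked by a rectangle.
import cv2

face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")
capture = cv2.VideoCapture(0)

while True:
    ok, frame = capture.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (fx, fy, fw, fh) in face_cascade.detectMultiScale(gray, 1.3, 5):
        # Look for eyes only within the detected face region.
        face_roi = gray[fy:fy + fh, fx:fx + fw]
        for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(face_roi):
            centre = (fx + ex + ew // 2, fy + ey + eh // 2)
            cv2.circle(frame, centre, 5, (0, 255, 0), -1)    # eye centres drive the character
        # The animated character's face would be drawn over this rectangle.
        cv2.rectangle(frame, (fx, fy), (fx + fw, fy + fh), (255, 0, 0), 2)
    cv2.imshow("mirror", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

capture.release()
cv2.destroyAllWindows()
```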
Digital Puppeteer
The use of various technologies to allow real actors to control and manipulate digital ‘puppets’ was discussed and assessed as viable within the current research and development framework. Techniques included voice recognition and analysis to control a character’s mouth, in an attempt to lip-sync a character from live input. Utilising a series of embedded screens, animatronics and subtle controls, it was felt possible to create a personal puppet outfit, allowing an actor wearing it to bring small puppets out into the audience. Imagine a small mouse popping in and out of an actor’s top pocket, before running around his back under his coat and out of another pocket. The system would use a number of small, localised areas (pockets, hats, sleeves, coat lining) connected wirelessly to a laptop. This is very ambitious, but such a fusion of man and machine, if well thought out, would give the impression of a cartoon alive in the real world.
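The lip-sync element could be roughed out along the lines below. This is a minimal sketch assuming the sounddevice library and a purely amplitude-driven mouth; the MOUTH_SHAPES and GAIN values are hypothetical, and a finished system would likely analyse phonemes rather than raw loudness.

```python
# A minimal sketch of amplitude-driven lip-sync, assuming the sounddevice library;
# MOUTH_SHAPES and GAIN are hypothetical, and a real system would analyse phonemes.
import numpy as np
import sounddevice as sd

MOUTH_SHAPES = 5    # number of discrete mouth positions the puppet can display
GAIN = 40.0         # scaling from RMS microphone level to mouth openness

def audio_callback(indata, frames, time, status):
    rms = float(np.sqrt(np.mean(indata ** 2)))               # loudness of this block
    mouth = min(MOUTH_SHAPES - 1, int(rms * GAIN * MOUTH_SHAPES))
    print(f"mouth shape: {mouth}", end="\r")                 # stand-in for the animation

# Listen to the default microphone, updating the mouth roughly 20 times a second.
with sd.InputStream(channels=1, samplerate=16000, blocksize=800,
                    callback=audio_callback):
    sd.sleep(10_000)                                         # run for ten seconds
```

The same laptop could drive the small embedded screens and animatronics wirelessly, with each localised area (pocket, hat, sleeve) treated as its own output.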