/ LABS

sharing our experiments...

Tetris-U

April 1, 2021
Tetris-U | Full Body Tracking with Machine Learning

By harnessing the power of machine learning to track one's entire body, T&DA has created an experimental Tetris clone... but with a twist!

To quote Roger Waters: "all in all you're just another brick in the wall"... Introducing Tetris-U, where the player becomes the Tetris bricks using nothing but a webcam!

The player captures real-life poses and converts them into stacking bricks. These poses can include twisting and contorting one's body like a dancer, or as if possessed by unnatural forces, or simply creating shapes like a budding mathematician in a desperate search for the next perfect shape.
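Under the hood, the core trick can be surprisingly simple: snap the tracked body keypoints onto a small grid and treat the occupied cells as the next falling brick. Here's a minimal sketch of that idea in Python, assuming normalised 2D keypoints from any off-the-shelf pose-estimation model; the grid size and example pose are purely illustrative.

```python
# Minimal sketch: turn normalised 2D pose keypoints into a Tetris-style brick.
# Assumes keypoints are (x, y) pairs in the 0..1 range from any webcam-based
# pose-estimation model -- the grid size below is an arbitrary choice.

GRID_W, GRID_H = 4, 4  # size of the brick grid


def pose_to_brick(keypoints, grid_w=GRID_W, grid_h=GRID_H):
    """Snap each keypoint onto a small grid and return the occupied cells."""
    cells = set()
    for x, y in keypoints:
        col = min(int(x * grid_w), grid_w - 1)
        row = min(int(y * grid_h), grid_h - 1)
        cells.add((col, row))
    return cells


# Example: an outstretched pose roughly collapses into a T-shaped cluster of cells.
t_pose = [(0.1, 0.3), (0.5, 0.3), (0.9, 0.3), (0.5, 0.6), (0.5, 0.9)]
print(sorted(pose_to_brick(t_pose)))  # [(0, 1), (2, 1), (2, 2), (2, 3), (3, 1)]
```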

You can play this while practicing yoga, dancing or any kind of workout!

Watch the demo video to see some Tetris-U shenanigans.

#machinelearning #tetris #ai #unity3d #mlagents #poseestimation #gameification #gamedev #rnd

EnviroCube

March 24, 2021
EnviroCube | Manipulating Virtual Environments with Physical Objects

You've heard of Darkseid's Mother Boxes or Thanos's Infinity Stones... well, T&DA wants some power of its own, so we've conjured the EnviroCube!

We have successfully digitised a lolly jar by attaching a #quest controller and recreating it in a virtual environment.

When our virtual camera targets a certain face of the EnviroCube, the world transforms into that new environment, be it desert, forest, snowy mountains and many more.

Simply put, each face of the cube can trigger various events to occur.
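For the curious, the face-detection part can boil down to a dot product: whichever face normal lines up best with the direction to the camera wins. Here's a rough Python sketch of that idea; the face-to-environment mapping and the example vectors are made up for illustration, and in the real setup the cube's rotation would come from the tracked Quest controller.

```python
# Minimal sketch of the EnviroCube idea: work out which face of a tracked cube
# is pointing at the camera and fire the environment mapped to that face.
import numpy as np

# Local-space normals of the six cube faces, mapped to environments (illustrative).
FACE_ENVIRONMENTS = {
    (1, 0, 0): "desert",
    (-1, 0, 0): "forest",
    (0, 1, 0): "snowy_mountains",
    (0, -1, 0): "underwater",
    (0, 0, 1): "space_station",
    (0, 0, -1): "jungle",
}


def facing_environment(cube_rotation, to_camera):
    """Return the environment for the face most aligned with the camera direction."""
    to_camera = to_camera / np.linalg.norm(to_camera)
    best_env, best_dot = None, -1.0
    for local_normal, env in FACE_ENVIRONMENTS.items():
        world_normal = cube_rotation @ np.array(local_normal, dtype=float)
        alignment = float(world_normal @ to_camera)
        if alignment > best_dot:
            best_env, best_dot = env, alignment
    return best_env


# Example: an un-rotated cube with the camera straight ahead on +Z picks "space_station".
print(facing_environment(np.eye(3), np.array([0.0, 0.0, 1.0])))
```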

From art gallery installations to futuristic remotes, to abstract puzzles, this tech has boundless potential waiting to be discovered.

It's wizardry, and these cosmic powers are within your reach through the EnviroCube!

Fish Fingers

March 18, 2021
"Fish Fingers" | Experimentation with Hand Tracking

"Fish Fingers" | Experimentation with Hand Tracking

Here at T&DA we have been experimenting with our #manus gloves as a focal element for environmental storytelling. In this video we show how this real-time tech can be adapted to an underwater scene where we control a swarm of fish at our fingertips.

Swim fishies swim!
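If you're wondering how the swarm follows the hands, the simplest version is just "every fish steers towards its nearest fingertip each frame". Here's a toy Python sketch of that behaviour; the fingertip positions would come from the tracked gloves, and the numbers below are illustrative.

```python
# A rough sketch of the "fish follow the fingertips" behaviour: each fish steers
# towards its nearest tracked fingertip every frame.
import numpy as np


def update_fish(fish_positions, fingertip_positions, speed=0.05):
    """Move every fish a small step towards its closest fingertip."""
    new_positions = []
    for fish in fish_positions:
        # Pick the nearest fingertip as this fish's target.
        target = min(fingertip_positions, key=lambda tip: np.linalg.norm(tip - fish))
        direction = target - fish
        distance = np.linalg.norm(direction)
        if distance > 1e-6:
            fish = fish + direction / distance * min(speed, distance)
        new_positions.append(fish)
    return new_positions


# Example: two fish drifting towards two fingertips over a few frames.
fish = [np.array([0.0, 0.0, 0.0]), np.array([1.0, 1.0, 0.0])]
tips = [np.array([0.5, 0.0, 0.0]), np.array([1.0, 2.0, 0.0])]
for _ in range(3):
    fish = update_fish(fish, tips)
print(fish)
```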

Shrink Ray

March 18, 2021
Shrink Ray

Ever since the words "help me, Obi-Wan..." were uttered, tech geeks everywhere have wondered how we can leap from our Jetsons-style Zoom calls to communicating via holograms and real-time 3D representations of ourselves.

In the T&DA offices we have edged a step closer to that future.
We call this technology Shrink Ray, after our very own Creative Director Raymond. Here you can see big-Ray puppeteering a mini-Raymoji ‘Shrink Ray’ in real time, communicating as a 3D character via an iPhone camera's AR.

We have done this by successfully uploading real-time mocap data to the cloud. This means that anyone anywhere around the world can access this data in AR and have a 3D person to interact with live.
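Conceptually, the streaming side is a loop that serialises each frame of joint data and pushes it to a relay that AR clients can read from. Here's a bare-bones Python sketch of that shape; the relay address, port and joint format below are placeholders for illustration, not our production pipeline.

```python
# Minimal sketch of streaming mocap frames to a relay that AR clients can read.
# The address, port and joint layout are placeholders, not a real endpoint.
import json
import socket
import time

RELAY_HOST, RELAY_PORT = "127.0.0.1", 9000  # stand-in for a cloud relay server


def send_mocap_frame(sock, joints):
    """joints: dict of joint name -> (position xyz, rotation quaternion xyzw)."""
    frame = {"timestamp": time.time(), "joints": joints}
    payload = json.dumps(frame).encode("utf-8")
    sock.sendto(payload, (RELAY_HOST, RELAY_PORT))


# Example: fire one frame of two joints at the relay over UDP.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_mocap_frame(sock, {
    "hips": ([0.0, 1.0, 0.0], [0.0, 0.0, 0.0, 1.0]),
    "head": ([0.0, 1.7, 0.0], [0.0, 0.0, 0.0, 1.0]),
})
```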

Huge potential for performance shows/live events, and the future of communications…

Office Overgrowth

March 18, 2021
“Office Overgrowth” - Realtime Forest Layout with LiDAR

In this experiment, T&DA has used Epic's Unreal Engine and Quixel Megascans to construct a realistic forest environment. Using the power of LiDAR technology, we collected accurate measurements of the T&DA office environment.

Taking this LiDAR scan, we then created our forest environment to the exact proportions and measurements of our office space, overlaying virtual rock formations and trees onto real-world objects.
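In practice that mapping can be as simple as: for every object measured in the scan, drop a virtual stand-in of the same size at the same spot. Here's a toy Python sketch of that idea; the scan measurements, asset names and "tall things become trees" rule are all made up for illustration.

```python
# Simplified sketch of an office-to-forest mapping: every object measured in the
# LiDAR scan gets a virtual stand-in of the same size at the same position.

# (name, centre xyz in metres, size xyz in metres) -- as if measured from the scan.
office_scan = [
    ("desk", (1.0, 0.0, 2.0), (1.6, 0.75, 0.8)),
    ("cupboard", (3.5, 0.0, 0.5), (0.9, 2.0, 0.6)),
    ("couch", (0.5, 0.0, 4.0), (2.2, 0.9, 1.0)),
]


def choose_asset(size):
    """Rough rule of thumb: tall objects become trees, low wide ones become rocks."""
    width, height, depth = size
    return "pine_tree" if height > 1.5 else "mossy_rock"


forest_layout = [
    {"asset": choose_asset(size), "position": centre, "scale": size}
    for name, centre, size in office_scan
]

for placement in forest_layout:
    print(placement)
```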

In this way, we have the ability to transform any real-world space into any virtual environment. Combining this with our previous experiment, we can turn the environment into an interactive experience where users walk around hybrid virtual worlds, blurring the lines between real and virtual universes.

The Upside Down

March 18, 2021
"The Upside Down" - Immersive AR using LiDAR

"The Upside Down" - Immersive AR using LiDAR

Harnessing the real-time capability of Unreal Engine, T&DA has rewritten the rules of reality to manipulate the form, interactivity, and atmosphere of any physical space. We have used the Oculus Quest as a virtual camera in Unreal Engine to explore and interact with a LiDAR scan that is mapped onto the real world.
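Keeping the scan locked to the real room comes down to one calibration step: a fixed transform that maps the headset's tracking space into the scan's coordinate system, applied to the camera pose every frame. Here's a minimal Python sketch of that idea; the calibration values are illustrative only.

```python
# Minimal sketch: map the headset's tracked pose into the LiDAR scan's coordinates
# with one fixed alignment transform. Calibration values here are illustrative.
import numpy as np


def make_alignment(rotation, translation):
    """Build a 4x4 transform from a 3x3 rotation and a translation vector."""
    transform = np.eye(4)
    transform[:3, :3] = rotation
    transform[:3, 3] = translation
    return transform


# Pretend calibration: the scan origin sits 2 m in front of where tracking started.
headset_to_scan = make_alignment(np.eye(3), np.array([0.0, 0.0, 2.0]))


def camera_pose_in_scan(headset_pose):
    """headset_pose: 4x4 pose from the headset; returns the pose inside the scan."""
    return headset_to_scan @ headset_pose


# Example: a headset at the tracking origin ends up 2 m into the scan.
print(camera_pose_in_scan(np.eye(4))[:3, 3])  # [0. 0. 2.]
```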

Understanding the measurements of any physical space has huge potential, as it allows us to transform your house or office into dreamscapes, theme parks, space stations, jungles and so much more.


Face the Jungle

March 18, 2021
"Face the Jungle" - Realtime cinematics controlled with your face.

"Face the Jungle" - Realtime cinematics controlled with your face.

Using Epic's Unreal Engine and Live Link, our very own Sean Simon created an immersive 360° virtual world, which you can navigate with nothing more than your facial features.

We can dynamically light a scene by opening our mouths and generate particles with the raise of an eyebrow... which brings a new meaning to "when you're smiling, the whole world lights up with you"!
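Behind the scenes this is essentially a remapping layer: the face capture outputs blendshape values between 0 and 1, and those get wired onto scene parameters every frame. Here's a tiny Python sketch of the idea; the curve names follow the usual ARKit-style blendshape naming, but the scaling factors and parameters are made up for illustration.

```python
# Small sketch of face-driven scene controls: blendshape values in the 0..1 range
# get remapped onto scene parameters each frame. Factors below are illustrative.

def face_to_scene_params(blendshapes):
    """blendshapes: dict like {'jawOpen': 0.8, 'browInnerUp': 0.3, ...}."""
    jaw_open = blendshapes.get("jawOpen", 0.0)
    brow_up = blendshapes.get("browInnerUp", 0.0)
    smile = blendshapes.get("mouthSmileLeft", 0.0)

    return {
        # Opening your mouth brightens the scene light.
        "light_intensity": 0.2 + 1.8 * jaw_open,
        # Raising an eyebrow spawns more particles per second.
        "particle_rate": int(200 * brow_up),
        # Smiling warms up the colour grade a touch.
        "colour_warmth": 0.5 + 0.5 * smile,
    }


print(face_to_scene_params({"jawOpen": 0.9, "browInnerUp": 0.4, "mouthSmileLeft": 0.7}))
```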

This is only one small example of what Epic Games' real-time software can offer us creatives. We will be showing more soon, as we reveal some of the wonderful work we are excited to be a part of.

/ Sizzle Reel