sharing our experiments...
Tiny Tesseracts is the first collection of 5-dimensional generative avatars in all of cyberspace. Developed by T&DA's Technical Artist Sean, each tesseract is dynamically animated in a seamless loop and ready for VR, AR, the metaverse, and beyond!
The tesseracts and their respective metadata sets were created through our bespoke 5D generative algorithm. There are 5 waves of tesseracts minting over the span of 5 weeks - Brutus, Cryost, Zebron, Nexus, and Neotronus.
Welcome to the next dimension.
Minting now on OpenSea
ELEPHANTOPIA | NFT & BLOCKCHAIN GAMING PROJECT
Elephantopia is the first step into the web3 and NFT world for T&DA. We're making a game world with the idea of eventually having dozens of different game modes and activities all within the same world.
An initial set of Trunket profile image NFTs helps establish the concept and garner interest. These NFTs will become the access key to the 3D game world we're working on. We've started with a racing segment, but over time there will be many different things to try as more gets built...
Soul Trader | Virtual Trading Card AR Filter
For the latest #FacebookHackathon, we have designed a filter that explores a future in which our soul can be captured, shared, and commodified into blockchain trading cards. This is a future where our identity can be exchanged for virtual coins with just the swipe of a finger.
T&DA presents Soul Trader.
#arfilters #sparkar #facebook #hackathon
Let's get this ball rolling!
Players roll a virtual marble through a hand-crafted obstacle course, collecting coins and knocking down walls to reach the final flag, but here’s the catch - they all control the same ball!
Think text adventures meets marble run: players influence the direction of the ball by submitting comments into the Live Chat (such as ‘charge forward!’ or ‘please mr marble, roll right’).
Every 10-30 seconds the commands are tallied, averaged, and normalised, and the marble is fired in that chosen direction.
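The tally-average-normalise step above can be sketched in a few lines. This is a minimal illustration, not the project's actual code: the keyword list, the keyword-to-vector mapping, and the function names are all assumptions for the example.

```python
import math
from collections import Counter

# Illustrative mapping from recognised chat keywords to 2D unit vectors.
COMMANDS = {
    "forward": (0.0, 1.0),
    "back": (0.0, -1.0),
    "left": (-1.0, 0.0),
    "right": (1.0, 0.0),
}

def steer_direction(chat_lines):
    """Tally keywords in the chat, average the voted vectors, normalise the result."""
    tally = Counter()
    for line in chat_lines:
        for word in COMMANDS:
            if word in line.lower():
                tally[word] += 1
    if not tally:
        return (0.0, 0.0)  # no votes this round: the marble stays put
    # Sum the voted directions, weighted by vote count.
    x = sum(COMMANDS[w][0] * n for w, n in tally.items())
    y = sum(COMMANDS[w][1] * n for w, n in tally.items())
    length = math.hypot(x, y)
    if length == 0:
        return (0.0, 0.0)  # opposing votes cancelled each other out exactly
    return (x / length, y / length)

# Example round: two votes forward, one vote right.
print(steer_direction(["charge forward!", "forward pls", "roll right"]))
```

Normalising keeps the launch impulse constant no matter how many people voted, so a busy chat doesn't fire the marble harder than a quiet one.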
By utilising Epic's Unreal Engine we provided high quality visuals and realistic physics for this live experience.
We hope to deliver more #Unreal live games that place the power in the players. We see potential to further develop ideas around this tech, such as pitting fans of competing teams, or groups with polarising opinions, and have them come together to achieve a common goal, or defeat each other and vanquish the enemy!
Tetris-U | Full Body Tracking with Machine Learning
By harnessing the power of machine learning to track one's entire body, T&DA has created an experimental Tetris clone... but with a twist!
To quote Roger Waters: "all in all you're just another brick in the wall"... Introducing Tetris-U! Where the player becomes the Tetris bricks by using a webcam.
The player captures real-life poses and converts them into stacking bricks. These poses can include twisting and contorting one's body like a dancer, or as if possessed by unnatural forces, or just creating shapes like a budding mathematician in a desperate search for the next perfect shape.
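One way the pose-to-brick conversion could work is to quantise the pose-estimation keypoints onto a coarse occupancy grid. This is a hypothetical sketch, not the project's implementation: the grid size, the normalised 0-1 keypoint coordinates, and the function name are all assumptions for illustration.

```python
def pose_to_brick(keypoints, cols=4, rows=4):
    """Map normalised (x, y) keypoints onto a cols x rows occupancy grid.

    Each keypoint lands in one cell; the filled cells form the "brick" shape.
    """
    grid = [[0] * cols for _ in range(rows)]
    for x, y in keypoints:
        # Clamp to the last cell so x == 1.0 or y == 1.0 stays in bounds.
        col = min(int(x * cols), cols - 1)
        row = min(int(y * rows), rows - 1)
        grid[row][col] = 1
    return grid

# A T-like pose: head at top centre, arms out to the sides, legs together.
pose = [(0.5, 0.1), (0.1, 0.4), (0.9, 0.4), (0.5, 0.4), (0.5, 0.9)]
for row in pose_to_brick(pose):
    print("".join("#" if cell else "." for cell in row))
```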
You can play this while practicing yoga, dancing or any kind of workout!
Watch the demo video to see some Tetris-U shenanigans.
#machinelearning #tetris #ai #unity3d #mlagents #poseestimation #gameification #gamedev #rnd
EnviroCube | Manipulating Virtual Environments with Physical Objects
You've heard of Darkseid's Motherboxes or Thanos's Infinity Stones... well T&DA wants some power of its own, so we've conjured the EnviroCube!
We have successfully digitised a lolly jar by attaching a #quest controller and recreating it in a virtual environment.
When our virtual camera targets a certain face of the EnviroCube, the world transforms into that new environment, be it desert, forest, snowy mountains, and many more.
Simply put, each face of the cube can trigger various events to occur.
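The face-selection step can be sketched simply: given the cube's six face normals in world space (e.g. derived from the tracked controller's rotation) and the direction from the cube toward the camera, pick the face pointing most directly at the camera and fire its event. This is an illustrative sketch only; the environment names beyond desert, forest, and snowy mountains, and all identifiers, are assumptions.

```python
# Hypothetical mapping from local face normals to triggered environments.
FACE_EVENTS = {
    (0, 0, 1): "desert",
    (0, 0, -1): "forest",
    (0, 1, 0): "snowy_mountains",
    (0, -1, 0): "ocean",
    (1, 0, 0): "city",
    (-1, 0, 0): "space",
}

def facing_event(to_camera):
    """Return the event for the face normal most aligned with `to_camera`."""
    def dot(a, b):
        return a[0] * b[0] + a[1] * b[1] + a[2] * b[2]
    # The largest dot product identifies the face turned toward the camera.
    best = max(FACE_EVENTS, key=lambda n: dot(n, to_camera))
    return FACE_EVENTS[best]

print(facing_event((0.1, 0.2, 0.9)))  # the +Z face wins -> "desert"
```

In practice the normals would first be rotated by the controller's tracked orientation; the selection logic stays the same.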
From art gallery installations to futuristic remotes, to abstract puzzles, this tech has boundless potential waiting to be discovered.
It's wizardry, and these cosmic powers are within your reach through the EnviroCube!
"Fish Fingers" | Experimentation with Hand Tracking
Here at T&DA we have been experimenting with our #manus gloves as a focal element for environmental storytelling. In this video we show how this real-time tech can be adapted to an underwater scene where we control a swarm of fish at the tip of our fingers.
Swim fishies swim!
Ever since the words "help me, Obi-Wan..." were uttered, tech geeks everywhere have wondered how we can leap from our Jetsons-style Zoom calls to communicating via holograms and 3D real-time representations of ourselves.
In the T&DA offices, we have taken a step closer to that future.
We call this technology Shrink-Ray, after our very own Creative Director Raymond. Here you can see big-Ray puppeteering a mini-Raymoji ‘Shrink Ray’ in real time, communicating as a 3D character via an iPhone camera's AR.
We have done this by successfully uploading real-time mocap data to the cloud. This means that anyone anywhere around the world can access this data in AR and have a 3D person to interact with live.
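At its simplest, streaming mocap this way means packaging each frame of joint data into a compact, timestamped packet that any AR client can fetch and apply. The sketch below is a hypothetical illustration of that packaging step; the joint names, quaternion layout, and transport details are all assumptions, not the actual pipeline.

```python
import json

def pack_frame(joints, timestamp):
    """Serialise a {joint: (x, y, z, w) quaternion} dict into a JSON packet."""
    return json.dumps(
        {"t": timestamp, "joints": {k: list(v) for k, v in joints.items()}},
        separators=(",", ":"),  # compact separators keep the packet small
    )

# One frame: the head joint at identity rotation, 12.5 s into the session.
frame = pack_frame({"head": (0.0, 0.0, 0.0, 1.0)}, timestamp=12.5)
print(frame)
```

The timestamp lets receiving clients order and interpolate frames, so viewers anywhere see the same performance in sync.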
Huge potential for performance shows/live events, and the future of communications…
“Office Overgrowth” - Realtime Forest Layout with LiDAR
In this experiment, T&DA has used Epic's Unreal Engine and Quixel Megascans to construct a realistic forest environment. Using the power of LiDAR technology, we collected accurate measurements of the T&DA office environment.
Taking this LiDAR scan, we then created our forest environment to the exact proportions and measurements of our office space, overlaying virtual rock formations and trees onto real world objects.
In this way, we have the ability to transform any real world space into any virtual environment. Combined with our previous experiment, we can make this environment an interactive experience where users walk around hybrid virtual worlds, blurring the lines between real and virtual universes.
"The Upside Down" - Immersive AR using LiDAR
Harnessing the real-time capability of Unreal Engine, T&DA has rewritten the rules of reality to manipulate the form, interactivity, and atmosphere of any physical space. We have used the Oculus Quest as a virtual camera in Unreal Engine to explore and interact with a LiDAR scan that is mapped onto the real world.
Understanding the measurements of any physical space has huge potential, as it allows us to transform your house or office into dreamscapes, theme parks, space stations, jungles, and so much more.
"Face the Jungle" - Realtime cinematics controlled with your face.
Using Epic's Unreal Engine and Live Link, our very own Sean Simon created a 360° immersive virtual world that you can navigate with nothing more than your facial features.
We can dynamically light a scene by opening our mouths and generate particles with the raise of an eyebrow... it brings a new meaning to "when you're smiling, the whole world lights up with you"!
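Conceptually, this kind of face-driven control remaps facial blendshape weights (Live Link delivers these in the 0-1 range) onto scene parameters. Here is a minimal sketch of that mapping; the blendshape names, parameter ranges, and function names are illustrative assumptions, not the project's actual values.

```python
def drive_scene(blendshapes, max_lumens=5000.0, max_particles=200):
    """Map 0-1 facial blendshape weights onto illustrative scene parameters."""
    # Clamp incoming weights to [0, 1] in case the tracker overshoots.
    jaw = min(max(blendshapes.get("jawOpen", 0.0), 0.0), 1.0)
    brow = min(max(blendshapes.get("browOuterUp", 0.0), 0.0), 1.0)
    return {
        "light_lumens": jaw * max_lumens,                # open mouth -> brighter scene
        "particles_per_sec": int(brow * max_particles),  # raised brow -> more particles
    }

# Half-open mouth, slightly raised brow.
print(drive_scene({"jawOpen": 0.5, "browOuterUp": 0.25}))
```

Because the mapping runs per frame, the scene responds continuously as the face moves rather than toggling between fixed states.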
This is only one small example of what Epic Games' real-time software can offer us creatives. We will be showing more soon, as we reveal some of the wonderful work we are excited to be a part of.