Showing posts with label 2d tracking.

Thursday, September 2, 2010

Five*1





AE Motion tracking WIP.
Exploring motion tracking in After Effects. The top video shows the base, unaltered track it gave me. As a stable track it was rubbish, but that's to be expected when you haven't tweaked the settings whatsoever. Mind you, it kept things in the general area of the tracking point reasonably well, so that was enough for me to work from. I linked the position of the comped character to the tracking data through an expression, so that when I changed the track layer the character's position would be updated automatically. Sure, I could have copied and pasted the relevant data, but this was just far more efficient.

The second video is a more refined track, but as you can see it still slips here and there, particularly with the larger camera movements.

The final video is the same track again, tweaked further, with touch-ups done by hand. It was a bit painstaking to do it this way, but it still worked out pretty well. It's not perfect yet; I'm going to keep playing with it to see if I can get it there, but as it stands I don't think it's too bad.

I have done a few other tests which I will upload as well, but this was the best one for this particular example. After this little exercise I've realised that the challenge of tracking into unstable footage, for the length of time planned and at the complexity of what I will be doing, would be... well, suicide. I'm pretty sure it would be doable, but it would not look as good as it would with stable footage. There will still be some handheld footage in the film, but the rest will be shot on a tripod and manipulated to appear as though it is handheld. I've thought of a few different approaches (keying, plugins etc.), but the route I will explore further is using handheld footage as a direct base for simulating the movement: the same technique I used in the examples above, applying the tracked movement to a stable piece of footage. I'm yet to shoot any stable test footage, but I've quickly mocked up an example using the movement of the footage above on a still image. As long as the footage to be used is of a higher res than the final export, it should work. Tests will confirm this of course, and it may mean double-shooting everything that I want as handheld, but that's a lot less time-consuming than the alternative. If not, I'll use a mixture of the techniques. (Video eg follows, ran out of my daily vimeo quota =p)
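The basic idea, data-wise, is dead simple: take the tracker's per-frame positions, subtract the first frame so it starts at zero, and add that offset to wherever the stable element sits. A quick Python sketch of the idea (track values invented, not from my actual AE comp):

```python
# Apply per-frame tracked offsets to a static element's base position,
# simulating handheld movement on stable footage.

def apply_track(base_pos, track):
    """base_pos: (x, y) of the element on the stable footage.
    track: list of (x, y) tracker positions, one per frame.
    Returns a position per frame, offset so frame 0 matches base_pos."""
    ox, oy = track[0]  # reference frame: no offset here
    return [(base_pos[0] + x - ox, base_pos[1] + y - oy) for x, y in track]

# Hypothetical tracker output (pixels) over four frames:
track = [(320.0, 240.0), (322.5, 239.0), (325.0, 241.5), (321.0, 243.0)]
positions = apply_track((100.0, 100.0), track)
# Frame 0 stays at the base position; later frames inherit the jitter.
```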


Monday, May 10, 2010

Two*8

WRITTEN COMPONENT ENTRY Avatar VFX notes.
According to James Cameron (& colleagues), Avatar is a game-changer for the way that VFX movies are made, watched, discussed and written about.
The virtual cinematography workflow, which allowed Cameron to directly observe on an on-set LCD monitor how an actor’s CG performance would interact with the CG Pandora environments in realtime, pushed the process enormously far forward. Rob Legato developed the workflow, and it is revolutionary because it allows the director to interact with CG elements as if he/she were shooting live action. The arts of digital and live-action moviemaking, as well as the ideas of pre- and post-production, are becoming more and more merged. The paradigm of “5D” is apparently a reality.
The ability to make key creative decisions within the digital environment is something that will alter the industry and its roles over the next few years. The access to ‘explore’ this digital realm, which he was responsible for, allowed Cameron to apply his filmmaking practice inside a virtual set, something that had not been possible before Avatar. Cameron scouted these virtual sets as he would have a physical environment.
Rob Powers was responsible for the development of the virtual environment, which allowed Cameron to alter foliage layouts and shift between day and night in the moment. Another technique allowed for the creation of an environment ‘sphere’: a never-ending environment set to a certain radius so that the rendering demands on the software were not excessive, letting the realtime render engine keep up with matte paintings. The organisation of all these elements was completely configurable.
Creating such a reality was only possible through the precise manipulation a computer allows. Such creatures and such a world were simply not possible any other way: they could not exist in reality, nor be expressed as effectively through the use of prosthetics or the like.
A new compositing system was devised for 3D, placing depth information on everything being rendered. According to Joe Letteri, it will become the standard for compositing because of the flexibility it allows: “Even if you’re doing a non-stereo movie, it’s just easier to composite in 3D than in 2D”.
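Depth-based compositing itself is easy to picture: every pixel carries a z value and the nearer one wins. A toy Python/numpy sketch of that idea (not Weta's actual system, obviously; arrays are tiny and hand-made purely for illustration):

```python
import numpy as np

def depth_composite(rgb_a, z_a, rgb_b, z_b):
    """Composite two layers using per-pixel depth.
    Smaller z = closer to camera. Shapes: rgb (H, W, 3), z (H, W)."""
    a_wins = (z_a <= z_b)[..., None]   # broadcast the depth mask over RGB
    return np.where(a_wins, rgb_a, rgb_b)

z_a = np.array([[1.0, 5.0]])           # left pixel of layer A is nearer
z_b = np.array([[3.0, 2.0]])           # right pixel of layer B is nearer
rgb_a = np.full((1, 2, 3), 255)        # white layer
rgb_b = np.zeros((1, 2, 3))            # black layer
out = depth_composite(rgb_a, z_a, rgb_b, z_b)
# out[0, 0] is white (A closer), out[0, 1] is black (B closer)
```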
One breakthrough that is highlighted in the article (aside from those already mentioned) was the explosions. The need for the explosions to interact with CG objects required that the explosions themselves be CG. Chris Horvath (who was instrumental on the fire fx in Harry Potter and the Half-Blood Prince) was responsible for the shading. The explosions used the same fluid simulation engine used for ‘The Maelstrom’ in Pirates of the Caribbean: At Worlds End. There were modifications to the engine so that it behaved appropriately.
Letteri states that there will be a lot learned from the experience(s) of Avatar, “I think what everyone discovered as you went along is that if you’re going to put a virtual stage together like a live-action shoot, then this becomes the front end to a visual effects piece. Because you not only start thinking in terms of takes and selects, but as shot design. You have to be able to switch from one to the other. And it requires a level of infrastructure for the whole thing that I think is going to benefit everyone if we can come up with some system across the board to make that easier”.

http://www.awn.com/articles/visual_effects/avatar-game-changer/page/6%2C1

Saturday, May 8, 2010

Two*4

WRITTEN COMPONENT ENTRY Tracking Tools notes.

Summary of 2D Tracking Tools:

Adobe
- After Effects is one of the most widely used effects solutions.

Avid
- Avid DS and Nitris vfx tools allow for advanced compositing and tracking. DS Nitris is similar to Photoshop and After Effects, with powerful motion tracking, keying, compositing and colour correction tools.

Discreet – Advanced Systems
- Includes Flint, Flame, Inferno, Fire and Smoke. All have 2D Tracking. Inferno & Flame have 3D Tracking. Lack multi-point solutions. Lack Maya export.

Discreet – Desktop
- Paint & Effect: Paint offered Illustrator-like solutions, while Effect offered compositing capabilities. These 2 programs were merged to create Combustion.

Curious gFx Pro
- Can import, composite, track and stabilise footage.

Digital Fusion
- Was at one point provided with Alias 3D.

Mirage
- Unified software allowing for the development of animated graphics and sfx.

Quantel
- Code-based comprehensive set of 2D tools.

Shake
- Shake is 2D-only, but accepts 3D tracking data for complex camera moves.

Summary of 3D Tracking Tools:

3D-Equalizer
- Original 3D tracking tool. High-end.

Boujou
- Hard to beat for tracking many kinds of shots.

Monet
- 4 point tracker that uses planes in place of points. Developed specifically for Cinesite to track the paintings in Harry Potter and the Prisoner of Azkaban.

REALVIZ MatchMover
- Incorporates both automatic and supervised tracking, without limit in combining both.

ras_track
- Academy Award winning software.

TRACK
- Digital Domain’s award winning software, works with Nuke (compositing software).

The PixelFarm
- PFTrack at the core of the program. Extracts complex 3D and 2D data from footage. Has a number of applets to utilise data.

Maya Live
- Autodesk’s (previously Alias) 3D camera tracking program, widely known for producing accurate z depth solutions.

Two*3


WRITTEN COMPONENT ENTRY Art of Tracking notes.
Tracking is the process of automatically locating a point (or series of points) frame-to-frame in a sequence, allowing the user to stabilise a shot, track elements to it, or solve object or camera movement. The process started as one-point tracking, which could stabilise a shot or add matching motion to a composite, but today it involves complex 3D camera solutions and extends to optical flow: the technology that tracks every pixel.
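At its core, a 2D tracker is just comparing a reference patch against nearby patches in the next frame. Here's a minimal sketch of that idea in Python/numpy (the feature, frame and search radius are all made up, and real trackers add subpixel refinement and smarter search):

```python
import numpy as np

def track_point(ref_patch, frame, centre, radius):
    """Find the best match for ref_patch in frame by exhaustive
    sum-of-squared-differences search around the last known centre."""
    ph, pw = ref_patch.shape
    cy, cx = centre
    best, best_pos = None, centre
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            y, x = cy + dy, cx + dx
            cand = frame[y:y + ph, x:x + pw]
            ssd = float(((cand - ref_patch) ** 2).sum())
            if best is None or ssd < best:
                best, best_pos = ssd, (y, x)
    return best_pos

frame = np.zeros((20, 20))
frame[7:10, 12:15] = 1.0                 # a bright 3x3 feature
patch = np.ones((3, 3))
# Search near the feature's last known position (6, 11):
print(track_point(patch, frame, (6, 11), 3))   # -> (7, 12)
```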
By understanding the process, including the inner algorithms that drive the software, it is possible to improve solutions and speed up work. Precautions should also be taken when filming so that tracking is unnecessary, or at least a less complicated task. Doug Roble, who won a Technical Academy Award for his TRACK software at Digital Domain, puts it this way: “We go the other way, we measure everything on set, we are really exact. 20% of the shots we solve are easy, 80% are hard. We focus on changing those odds”.
Historical Overview
Prior to digital tracking, most effects shots required the camera to be locked off; it was impossible to align shots in post without motion control of the capturing cameras. Hand tracking has been attempted, but the eye isn’t precise enough to capture every movement.
The US Defence Department initially developed the concept of tracking for use in missile guidance systems. The earliest vfx application was in 1985 at the NYIT Graphics Lab by Tom Brigham and J.P. Lewis, who implemented an FFT-based tracker used on a number of television shows. ILM had an early 2D tracking software called ‘MM2’, which was used on Hook; it was a manual tool which could alter the positions of the keyframes. ILM used this software as the base for the later technology used on Jurassic Park, the first 3D tracking software, developed by J.P. Lewis and Rob Bogart.
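The FFT-based approach can be sketched with phase correlation, which recovers the translation between two frames from the cross-power spectrum. This is a generic Python/numpy illustration of the technique, not the actual NYIT implementation (frames here are synthetic noise):

```python
import numpy as np

def phase_correlate(a, b):
    """Estimate the integer-pixel translation taking b to a."""
    Fa, Fb = np.fft.fft2(a), np.fft.fft2(b)
    cross = Fa * np.conj(Fb)
    cross /= np.abs(cross) + 1e-12      # keep phase, drop magnitude
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Peaks past the midpoint wrap around to negative shifts:
    if dy > a.shape[0] // 2: dy -= a.shape[0]
    if dx > a.shape[1] // 2: dx -= a.shape[1]
    return int(dy), int(dx)

rng = np.random.default_rng(0)
frame0 = rng.random((64, 64))
frame1 = np.roll(frame0, shift=(3, -5), axis=(0, 1))  # known motion
print(phase_correlate(frame1, frame0))   # -> (3, -5)
```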
ILM allowed Lewis to publish the algorithm involved, and it has also found its way into other software packages such as Shake and Commotion.
Joe Alter developed one of the earliest markerless tracking systems, which was used for the morph plates on Star Trek: The Next Generation.
Discreet first brought tracking software to the wider vfx community via ‘Flame’. Flame was initially ‘Flash’, written in Melbourne, Australia by Gary Tregaskis, and was launched at Siggraph 1992. Flame was first used in feature film work on Super Mario Bros (1992).
“There’s a balance between a perfect track that takes forever and an interactive track that falls off from time to time. We decided to go with interactivity, and the real thought went into making it easy to use, making it easy to stop and restart a track, because it was never going to be perfect”.
            - Colin Wrey (first implementation of Quantel’s Auto Lock Follow [ALF])
Action was a general tracking tool which really allowed users to make the most of the technology. In the early days tracking a 30 second shot was a 30 minute batch process with no guaranteed success.
Shake was immediately popular because it offered the ability to work at a higher bit depth than anything else on the market. A very early version of the software was used on the opening shot of Titanic (a deep underwater shot), which had originally been filmed with 8 bits/channel of colour info; slight tonal gradations brought massive banding problems, but Shake was able to glean a clean, artifact-free shot.
Flame/Fusion Version 5 (1997) let you track as many points as you wanted, and added automatic triangulation stabilisation, corner tracking and perspective tracking.
1998 was when tracking began to be acknowledged by industry awards: the Technical and Scientific Academy Award went to Gary Tregaskis (primary design) and Dominique Boisvert, Philippe Panzini and Andrew LeBlanc for the development and implementation of Flame and Inferno. Douglas R. Roble won the Technical Achievement Award for contributing to tracking technology and the design and implementation of the TRACK system. TRACK is integrated software at Digital Domain which uses computer-vision techniques to extract 2D & 3D information about the camera and the scene, and it feeds Digital Domain’s Nuke system with full 3D camera solutions. TRACK’s shift from 2D to 3D tracking was a reflection of the industry’s shift to 3D.
Luc Robert commented, “The main benefit from match-moving software is a marked increase in productivity. Automatic match-moving is definitely a huge help in this respect, since on a significant fraction of shots it produces a solution which requires no human interaction at all. Most post-production companies using automatic match-moving have integrated it into their pipeline so that, for virtually no overhead cost, they can benefit from these automatic solutions. Despite this, there will always be shots which automatic match-moving software will not be able to solve. It is crucial for professionals to solve 100% of the shots they want to match-move, and for that, the software has to allow them to control and guide the process if necessary”.
3D-Equalizer tech came about in 2000; it was an advanced camera and object match-moving package.
Boujou has allowed for seamless combination of live action and 3D.
2D Tracking
Most modern trackers work on a luminance (black-and-white) version of the track box. Digital Fusion alone uses colour information automatically; the interface shows the strength of each of the 3 colour channels and which one it has decided to use. Shake offers the ability to use hue or saturation for tracking instead of luminance. Ron Brinkman pointed out that this can be very useful and effective in some cases of multicoloured patterns: try a hue track if the monochrome version of the image has poor contrast. After Effects uses luminance as the preferred colour channel for the track (there is a switch to control RGB or saturation).
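For reference, the luminance a tracker sees is just a weighted mix of R, G and B (Rec. 601 weights used in this sketch; the exact weights vary by package). It also shows why hue can rescue a poor-contrast track: two very different colours can land on almost the same luminance.

```python
def luminance(r, g, b):
    """Rec. 601 luma: the black-and-white value most trackers match on."""
    return 0.299 * r + 0.587 * g + 0.114 * b

# A red and a teal that are visually distinct but nearly identical in
# luminance, so a luminance tracker would see almost no contrast between
# them while a hue track would separate them easily.
red = (200, 0, 0)
teal = (0, 87, 77)
print(round(luminance(*red), 1))    # 59.8
print(round(luminance(*teal), 1))   # 59.8
```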
Ideal on-set Tracking Marker
Ensure that the tracking markers are not something which is going to blow away. Common tracking markers are crosses, triangles or circles. Each software has a different algorithm and therefore prefers one shape to another; AE, for example, doesn’t really prefer one shape over another. The question of which marker to use also depends on what the camera is doing: strictly speaking, a circle provides a more robust track point (better for zooms and focus changes). There is no right or wrong answer; it is dependent on many factors. LED markers were considered, but were found to be more effective only in low-light conditions; LEDs have high contrast, but there is the light and glare from the light itself to contend with. Most experts agree that there is little point in running special colour corrections, as tracking algorithms generally already have some sort of colour correction built in.
Dealing With Noise in tracks
A key aspect of tracking is reducing jitter. You can remove jitter by deleting keyframes and allowing an interpolator to fill in the gaps. A small blur can also reduce noise. PFTrack has a very comprehensive noise-reduction algorithm called ‘de-noise’: a specifically designed algorithm that acts like a median filter but maintains all edge detail. 3D trackers can produce a better track by tracking a video-resolution image; 2D trackers cannot.
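The delete-keyframes-and-interpolate trick can be sketched in a few lines of Python on a made-up 1D track: keep every 4th keyframe (plus the last), linearly interpolate the rest, and the high-frequency jitter between kept frames disappears.

```python
def decimate_and_interp(track, keep_every):
    """Keep every Nth keyframe and linearly interpolate the gaps."""
    keys = list(range(0, len(track), keep_every))
    if keys[-1] != len(track) - 1:
        keys.append(len(track) - 1)        # always keep the last frame
    out = []
    for a, b in zip(keys, keys[1:]):
        for f in range(a, b):
            t = (f - a) / (b - a)
            out.append(track[a] * (1 - t) + track[b] * t)
    out.append(track[keys[-1]])
    return out

# Invented 1D track positions with jitter between the kept keyframes:
noisy = [0.0, 0.4, 0.1, 0.5, 4.0, 4.3, 3.9, 4.6, 8.0]
print(decimate_and_interp(noisy, 4))
# -> [0.0, 1.0, 2.0, 3.0, 4.0, 5.0, 6.0, 7.0, 8.0]
```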
3D Tracking
Steven D. Katz explained 3D tracking: “Camera-matching software utilises a subset of projective geometry called epipolar geometry. This branch of mathematics is used to describe the geometric relationship between two optical systems viewing the same subject and can be used to locate points in space. Because a moving camera offers a new view every frame, epipolar geometry works for a single moving camera as well, and each new view is understood as a separate optical system”.
3D tracking can provide solutions for measuring objects at a distance, projective geometry and camera angle projection. The 3D tracking program solves the camera position by using 2 camera views; epipolar geometry is a type of triangulation. Epipolar geometry proposes that the point we are interested in is actually constrained to a single line, which limits the search for that point.
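The triangulation idea reduces to this: each camera centre casts a ray through its observed image point, and the 3D point is where the rays (nearly) meet. A minimal Python/numpy sketch on a synthetic two-camera setup (the midpoint method, not any particular package's solver):

```python
import numpy as np

def triangulate(c1, d1, c2, d2):
    """c1, c2: camera centres; d1, d2: unit ray directions (3-vectors).
    Finds ray parameters s, t minimising |(c1 + s*d1) - (c2 + t*d2)|
    and returns the midpoint of the closest approach."""
    A = np.stack([d1, -d2], axis=1)
    s, t = np.linalg.lstsq(A, c2 - c1, rcond=None)[0]
    return ((c1 + s * d1) + (c2 + t * d2)) / 2

point = np.array([1.0, 2.0, 10.0])                  # ground-truth point
c1, c2 = np.array([0.0, 0.0, 0.0]), np.array([5.0, 0.0, 0.0])
d1 = (point - c1) / np.linalg.norm(point - c1)      # perfect observations
d2 = (point - c2) / np.linalg.norm(point - c2)
print(triangulate(c1, d1, c2, d2))                  # recovers the point
```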
Ideal Tracker for 3D Tracking
3D trackers rarely use colour information. 3D tracking experts have very strong opinions on the ideal markers, and strongly support triangular ones. The advantage is that a triangle provides 3 well-spaced, high-contrast corners for automatic trackers to pick up. Accurate recording of on-set measurements allows the user to relate the tracked camera to the room, so that the artist has a virtual set to move around.
3D Tracking falls into 2 categories;
-       Automatic Trackers
-       Manual Trackers
Automatic point generation is typical of 3D tracking. This allows the software to track highly complex footage, such as water.
Dealing with Lens Distortion
Lens distortion has very little influence on 2D tracking, but can have a huge influence on the calculation processes involved in 3D tracking. Most 3D camera packages have tools to adjust for lens distortion, which only affects the camera solving.
In MatchMover, lens distortion is represented by a mathematical model whose parameters are computed automatically based on point tracks. If the distortion is too complex, a calibration pattern may be applied. The most powerful 3D packages additionally allow for extra information (or images) to be added.
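As an illustration, a common radial distortion model of the kind such packages fit scales each point by a polynomial in r²; the k1/k2 coefficients below are invented, not MatchMover's actual values.

```python
def distort(x, y, k1, k2):
    """Apply radial distortion to normalised image coordinates
    (x, y relative to the lens centre)."""
    r2 = x * x + y * y
    scale = 1 + k1 * r2 + k2 * r2 * r2
    return x * scale, y * scale

# A point at the centre is unaffected; points near the edge move most,
# which is why distortion mainly hurts wide-angle camera solves.
print(distort(0.0, 0.0, -0.2, 0.05))   # (0.0, 0.0)
print(distort(0.5, 0.5, -0.2, 0.05))   # pulled toward the centre
```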
The future of 3D tracking will likely be linked to optical flow technology, where every point is tracked and, once combined with 3D tracking, produces a set of floating point data. Some 2D features may be used to calculate a 3D match-move and some to calculate optical flow. These provide very reliable solutions. 3D tracking is fundamentally about tracking a camera, not necessarily things in the scene.
Ron Brinkman commented: “There’s no doubt in my mind that tracking will continue to evolve in a direction that relies on characterising the entire image as much as possible, as opposed to specific feature-based analysis. Ultimately, if your starting point includes information about the camera (derived from 3D analysis), then any additional tracking methods will have their accuracy improved dramatically”.
 
http://www.fxguide.com/article212.html