This, the final talk I’ll post from GDC’08, centred on the development of the first company-wide technology platform (or engine) for Square Enix. Despite the heavy tech focus, this session drew the largest queue I saw at the conference, thanks to the chance of gleaning any information from these Japanese RPG masters.
Taku Murata – General Manager, Technical Research Division
Traditionally, a new platform was created for each title, with the game first made in Japanese and translations following much later. This looks set to change with the latest upcoming releases, which will be very exciting to many western fans, and the engine targets PS3, PC and Xbox 360.
Murata’s history reads like a chronology of technological breakthroughs in Japanese game development, with much of his work driven by animation – facial animation in particular. Most interesting of all was the admission that several of the driving forces for this new engine centred on displaying characters’ faces at very high fidelity in close-up.
1997 – Final Fantasy Tactics: First use of real-time previewing on the console, convincing Murata of the power of this approach. The game was edited on the PS1 in real time, driven by the artists’ requests.
2000 – Vagrant Story: Used a unified tool to create cutscenes, preview textures and visual effects. Apart from the opening FMV, every cinematic was in-game, and the game featured skeleton/bone animation for the first time. One requirement was that the team had to preview facial texture animations to ensure they looked good without anti-aliasing. Murata spoke at length about what he called “peak-points”, which are presumably normals. Apparently they had to change the peak-points relative to the camera to maintain the facial integrity. Lots of post-effect and field-of-view corrections, such as a fish-eye lens filter, were employed to promote a wide range of facial expressions.
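Murata didn’t explain the technique itself, but one common trick that matches the description of changing normals relative to the camera is to bend a vertex normal toward the view direction, softening harsh shading on a stylised face during close-ups. A minimal sketch under that assumption (the function name and `blend` parameter are my own, not from the talk):

```python
import math

def bend_normal_toward_camera(normal, position, camera_pos, blend=0.5):
    """Blend a unit vertex normal toward the camera to soften facial shading.

    blend=0.0 keeps the original normal; blend=1.0 points it at the camera.
    """
    # Direction from the vertex toward the camera, normalised.
    view = tuple(c - p for c, p in zip(camera_pos, position))
    length = math.sqrt(sum(v * v for v in view))
    view = tuple(v / length for v in view)
    # Linear blend between the original normal and the view direction.
    bent = tuple((1.0 - blend) * n + blend * v for n, v in zip(normal, view))
    # Re-normalise so lighting calculations stay well-behaved.
    length = math.sqrt(sum(b * b for b in bent))
    return tuple(b / length for b in bent)
```

In practice this would run per-vertex (or per-pixel in a shader) only on face meshes, with the blend weight keyed by the animators per shot.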
2004 – Data Standardisation: Established a common 3D data format. Within the company there was a big debate over whether to use COLLADA, FBX or a proprietary file format, with each team previously using a different format. They eventually decided on their own proprietary data format, but the integration process wasn’t easy, especially when convincing certain entrenched teams.
2005 – Tech Division Established: Murata’s team was formally created with the objective of establishing a company-wide technology.
2006 – Final Fantasy XII: Team sizes by this time were incomparable to previous projects and as such brought with them large volumes of assets required to be created by staff with diverse skill levels. To aid this, Murata’s team created separate tool sets for different needs and skill levels.
2007 – Crystal Tools: The company-wide technology was finally rolled out. Previously named the “White Engine”, the Crystal Tools platform is the result of Murata’s team’s work. The essence of the Crystal Tools was described thus:
Must support extensive use of character close-ups.
Focus on stylised facial expressions designed to promote the anime style.
Must allow detailed control of characters.
Specialises in physics, visual effects, post-effects and Graphical User Interface.
Must support a large team, with a detailed division of work.
Contains separate tools for different functions.
With extensive use of GUIs to accommodate veterans and novices alike, and a focus on ease of use, the Crystal Tools are currently being used in the production of Final Fantasy XIII, Final Fantasy Versus XIII and an unannounced next-gen MMORPG. Separate teams for separate tools allowed v1.0 to be created in a year, though, as is often the case in engine development, they did not employ a technical writer, so the documentation suffered. This is an area that game developers are slowly coming around to: complex tools and systems are created on an almost daily basis, but oftentimes their power is not realised due to poor or missing documentation.
In response to a post-session question, Murata admitted that Square Enix cannot currently license the tools because of the state of the documentation, though possibly it will in the future – something very interesting indeed should the chance to work with their tools arise. Of all the tools mentioned (Character Viewer, Effects Editor, Cutscene Editor, Layout Tool and Sound Maker), I fortunately managed to find screens online of the two most interesting to videogame animators.
Character Viewer: The Character Viewer exists on PC only and is used solely for previewing textures and animation, with modelling and animation still created in Maya or XSI. It clearly displays a hierarchy view, though, hinting at additional character set-up that could be performed and maintained from within this external viewer.
Cutscene Editor: Bearing a resemblance to Unreal Engine 3’s “Matinee”, the Cutscene Editor offers timeline control over cinematography, visual effects and audio in a single editor. The multi-screen view shows that this is a true editor, not simply a tool for reconstructing pre-exported cameras and cutscene assets, with an asset browser and animation curves clearly visible towards the lower right.