
COVER ARTIST David D. Jimenez
SOFTWARE Marvelous Designer, Substance Painter, 3ds Max, ZBrush, Photoshop, UVLayout, Corona Renderer

David Domingo Jimenez is a Spanish freelance CG generalist with ten years of experience in animation, VFX and commercials. His specialities include sculpting, set/prop/character texturing, shading, lighting and more. www.ddjimenez.com

3D WORLD April 2020

3

3dworld.creativebloq.com


EDITOR’S WELCOME
What’s hot this issue

SAVE UP TO 67%* on your bundle subscription when you double up! – turn to page 28

Explore the rise of the robots!

If you couldn’t guess from our glorious cover art, this issue we kick things off with a look at the perennial favourite topic of robots for our lead feature. I don’t know a 3D artist who hasn’t modelled some mecha at some point or another, and there’s such a diversity in style that we thought it was high time to investigate the art, artists and the genre itself. It seems like every issue there is at least one beautifully constructed VFX-heavy movie to dissect, but this issue we take a look at The Aeronauts which, with its period drama look and feel, isn’t an obvious choice. Read on however to discover some of the fantastic work that went into creating such a visual feast.

We pride ourselves on offering you the best training and step-by-step guides to help you boost your skills, and this issue is no different. If you have ever fancied creating a stunning wildlife scene, or wondered about using Megascans in your work, then turn to page 60 for a fantastic piece on just this, which includes tips on vital elements such as colour contrast and chromatic aberration, which both go such a long way to a fully realised render but, to many, are still dark arts. I hope you enjoy the issue!

Rob Redman, Editor rob.redman@futurenet.com

SPOTLIGHT ON OUR CONTRIBUTORS

David D. Jimenez David Jimenez shares his thoughts on robot design and creation in our feature starting over on page 20. www.ddjimenez.com

WEBSITE 3dworld.creativebloq.com

Paul Franklin DNEG’s creative director tells us about his journey from being a science fiction fan to a visual effects legend, on page 98. www.dneg.com

FACEBOOK www.facebook.com/3dworldmagazine


Mike Griggs 3D World regular Mike Griggs returns with the latest instalments of our long-running Basics and Bootcamp series. www.creativebloke.com

INSTAGRAM @3DWorldMag


TWITTER @3DWorldMag



ARTIST SHOWCASE
8 The Gallery
112 Technique Focus

FEATURES
20 Rise of the Robot
Expert bot builders share their tricks and techniques for creating amazing robots

30 Animal Logic: Part 1
The first instalment in a new series delving into the inner workings of the famous studio

38 The Aeronauts
Go behind the scenes of the impressive visual effects in adventure film The Aeronauts

54 Turbocharge your motion graphics
How to use the powerful Redshift Object tools in Cinema 4D

60 Make a photorealistic nature scene
Create a realistic render with Quixel Megascans

68 Create a Disney Pixar-style project
Build a scene inspired by Pixar

74 Simplify your blend shape creation
Antony Ward delivers a guide to efficient blend shape creation

78 SideFX Houdini Basics
We continue our look at the core digital content creation apps

80 Bootcamp: Gravity Sketch
Find out more about this VR tool

THE PIPELINE
46 Render like Pixar: Part 3
Discover how to harness the power of RenderMan

ARTIST Q&A
82 Your CG problems solved
Pro artists tackle your CG queries


THE HUB
88 Meet the artist: Tim Doubleday
The VFX product manager at Vicon talks his career journey

92 A day in the life
Stunt performer Eric Jacobus describes a typical, jam-packed day at action design studio SuperAlloy Interactive

96 Pro thoughts: Lorna Burrows
What does the future hold for VR and AR technology?

98 Dream a little bigger
DNEG’s creative director Paul Franklin details his journey from sci-fi fan to VFX legend

100 Animating in real time
e.d. films discuss their short films

104 Into the blue
Dell partners with The Mill to promote their new products

106 Open-source 3D for illustrators
The highlights of OSS tools

REVIEWS
110 Gaea
How do we rate this terrain design software?

REGULARS
28 Subscriptions
Subscribe and save with our new special offers!

114 Free downloads
Images and videos from our tutorial section

SIGN UP! For content direct to your inbox, subscribe to our weekly newsletter at: bit.ly/3dworldnews

SAVE UP TO 67% on your subscriptions


The best digital art from the community



CG art to inspire

PLUTONIUM GAS STATION
ARTIST Anna Zelensky
SOFTWARE Blender, Photoshop

Anna Zelensky has been studying 3D art for the past three months, having spent the last two years working exclusively in 2D. At the moment Zelensky devotes her time to studying English and enhancing her 3D skill set. This neon-soaked image is one of her first complete works and took just four days to complete. This image was made as part of the Polygon Runway course by Roman Klco. A fairly straightforward


workflow brought it to life, beginning with a series of sketches that figured out large and medium shapes. “I then created large cubes in Blender and decided where to put them and what piece will come next,” says Zelensky, “then I moved on to smaller shapes before clarifying the details.” The process was not without its challenges, however, with Zelensky still learning her way around Blender. The highlight of Zelensky’s workflow is painting objects and creating the feel of an image. “I believe that a competent ‘feel’ makes the art magical,” she adds. artstation.com/anyanna




BIG GIRL
ARTIST Dávid Fatér
SOFTWARE ZBrush, Maya, Substance Painter, V-Ray, Photoshop

Freelance 3D artist Dávid Fatér has been working in CG for a year, mostly creating characters for games and animation. For this image, the entire model was hand sculpted and poly modelled in ZBrush. “I loved sculpting her hair and clothes because those parts are very relaxing for me,” he adds. Fatér enjoys the texturing process, often trying out new methods picked up from fellow artists. DynaMesh or Sculptris Pro is utilised for the organic elements of Fatér’s work, where he models until he is happy with the form and silhouette. Fatér is inspired by his dreams of working for a big company and creating his own games, animations and stories with friends. “I’m in love with failure, learning, sharing and experiencing new things outside my comfort zone,” he adds. “I get a lot of inspiration from other artists all over the world, this community is huge and very powerful, I’m absolutely loving it.” instagram.com/david_fater





ROYAL GUARD
ARTIST Steven Jia
SOFTWARE ZBrush, Maya, RizomUV, Substance Painter, Marmoset Toolbag, Photoshop

This regal warrior image was inspired by a concept from artist Kyoung Hwan Kim and took freelance 3D artist Steven Jia a total of 25 days to complete. “The process I used to create the hair may be the only technique that wasn’t typical,” he explains. “I used FiberMesh in ZBrush to get the curves, then I turned the curves into the hair cards using Bonus Tools, which is a free plug-in for Maya. It can control the individual direction of the hair card, which is really handy.” Jia’s personal highlights were painting textures in Substance Painter and tweaking lights and effects in Marmoset Toolbag. He adds: “The model doesn’t look grey anymore, it’s delightful to look at the changes made by a colour, a material, or a direction of light.” Three lights usually form the basis when Jia renders scenes in Marmoset Toolbag: one directional light, a spotlight, and an omni light without colour. “Then I tweak light locations and brightness to adjust the shadow changes,” he continues. “After that, additional omni lights are added to provide richer light and shadow transitions to the scene. Lastly, I’ll add colours to the lights and apply some camera effects.” stevenjia.artstation.com







POWERPUFF GIRLS VILLAINS
ARTIST Mohamed Chahin
SOFTWARE Affinity Photo, Blender

Mohamed Chahin is a freelance game artist with experience in animation, branding and UI. “This image took me two weeks,” he explains. “I created each of the villains separately then after a while I decided to combine them all into one render.” Most of the characters were achieved with straightforward modelling from reference. “Using metaballs for HIM's fur lace thing and Princess Morbucks' hair was really fun,” adds Chahin. “For HIM's lace, I used a mix of volumetric, transparent and subsurface scattering shaders.” Each villain in this image introduced Chahin to a new challenge, “since in the original cartoon all of them are flat and very stylised, I had to play around until I got the look right in 3D,” he adds. Once he has collected a ton of reference images, Chahin begins modelling. With the rough shapes in place he adds and tweaks the lighting, before adding materials and shaders. He continues: “If something needs a specific texture either I create it or look for one that suits the scene online and apply it there. Once everything is in place I render a high-resolution version and voilà, I move to the next project.” artstation.com/mchahin




LEGION
ARTIST Sadan Vague
SOFTWARE ZBrush, Substance Painter, Marmoset Toolbag

Freelance 3D artist Sadan Vague based this demonic model on a 2D illustration by tattoo artist Derek Noble. He describes his approach to the character as “nothing unusual, just good old sculpting.” Vague’s freelance career sees him work frequently with 3D scan data across video games, film and advertising, and he also dabbles in traditional clay sculpting. For Vague, the most enjoyable part of the process was adapting a piece of 2D art into a 3D character, all while keeping the spirit of Noble’s original piece alive. The entire process took him around three or four days to complete. Vague normally begins the creative process without any clear ideas in his head. He adds: “I fight with every new sculpture I do, because having no clear idea usually causes a lot of problems.” A typical workflow for Vague usually incorporates a variety of software, including ZBrush, Marvelous Designer, Maya, XSI and more. “Another, probably even more important challenge is to keep my own voice,” Vague continues, “because it is easy to get lost among the thousands of other artists.” artstation.com/sadania




The Rookies

The Rookies is a platform to help digital artists get discovered without having to compete with professionals for attention. You’ll get to the front page of those sites one day, but for now, we’ve got your back and want to help turn your passion for creative media into a successful career. WWW.THEROOKIES.CO



LISBETH SALANDER This character was created as part of my Advanced Term final project at Think Tank Online. The task was to create a full film character. I decided to do a likeness of Rooney Mara as Lisbeth Salander from The Girl With The Dragon Tattoo, because of the actress’ striking features and the character’s aesthetic. The project taught me a lot about the struggle of capturing likeness and creating clothing for production. artstation.com/archie3d


YEAR CREATED 2019
SOFTWARE Maya, ZBrush, Marvelous Designer, R3DS Wrap, xNormal, Knald, Substance Painter, Mari, V-Ray, Photoshop

ARTIST Archie Majumdar
Student at Think Tank Online, with a concentration on 3D characters for film. Always exploring new techniques to bring a character to life.
LOCATION Sydney, Australia


Rise of the Robot

Robots have been an enduring presence in the 3D art community for decades, with creatives constantly looking to put their own stamp on the sci-fi concept. From androids to mech suits, household droids to war machines, artists’ portfolios are awash with mechanical marvels. Just what is it about robots that keeps 3D artists coming back? And how do different artists approach the subject? 3D World has gathered some of the very best bot builders to tell us just that. Steve Talkowski, a character designer, 3D modeller and VR artist based in Los Angeles, has his own ideas about the enduring appeal of robots in CG circles. “They are a fun way to anthropomorphise the continual progress of technology

and AI,” he tells 3D World. As technology advances in the real world, so too do artists’ robotic creations. This constant stream of inspiration ensures the metallic characters never go out of style. “We have grown up in a world where sci-fi has become possibilities rather than just fantasy,” says Garreth Jackson, a Serbia-based hard-surface artist and concept designer with a passion for sci-fi artwork. “Sci-fi films, video games and novels play an influential role in the designs and concepts that artists come up with,” he continues. “The realities of AI, virtual reality and advancing technology are no longer far-fetched concepts in futuristic movies, but are now our present, and what we artists create reflects this.”




The top artists behind CG robots discuss their creations, as well as sharing tips, tricks and techniques so that you can build your own



David Domingo Jimenez is a lead texture and shading artist for set and props at Ilion Animation Studios in Spain. Discussing the appeal of robots to 3D artists, he adds: “Making a robot is always a fun challenge, with their complex pieces, gears and prostheses. Robots have always existed in pop culture and even more so today, when technology has evolved so much that they even exist in real life. Robots are always in fashion.” For visual designer Jarlan Perez, the fascination with creating robots is more existential. He explains that artists can explore a complex series of emotions and questions through the lens of a 3D robot character. “I enjoy creating robots that don’t follow the typical image that comes to mind when we think of robots,” he explains. “Instead, I look at these bots as being on a journey of self-discovery, like us. Built for a predetermined purpose but wanting to find meaning of their own, even if it conflicts with their physical design.”

GRAND DESIGNS The word robot often conjures up images of a calculating and unfeeling machine, but Perez maintains that this doesn’t have to be the case. “Robots don’t have to be cold machines with little to no personality on a mission to eradicate human life,” he asserts. This comes across in the fun and vibrant style of Perez’s robots, something he developed gradually over a period of exploration. Several design principles govern Perez’s process, as he explains: “For detailing I stick to a 60/40 or 70/30 white space to details ratio. These details are often reserved for areas or components of motion.

Top: Jackson went outside of his comfort zone for this cyberpunk-inspired character Below: Gwyllgi - The Dog of Darkness, created by Garreth Jackson using Cinema 4D

This is important as I want the designs to be clear to the viewer with plenty of rest space for their eyes. Most of the inner workings of these robots are hidden with clean exterior shells.” Perez holds back on the details that characterise the robots of so many other artists. “It’s easy to get carried away and over detail to the point where the design doesn’t read clearly,” he explains. To maintain a balanced aesthetic, Perez makes sure to vary the look and scale of any details in his designs, while never skimping on functionality. “Obviously we’re not engineering fully functional machines, but they have to at least appear functional to the viewer. If a piece needs to bend, add the proper joints that allow it to bend and rotate. People will pick up on that sort of thing.”


An artist’s individual style will depend on the story or feeling they are trying to convey, which in Perez’s case is a sense of fun and warmth. “A unique style certainly helps people recognise an artist’s work easier,” he says. “In the workplace we don’t always get to use our own style, so we also have to be flexible and open to trying new things. Over time our style becomes a collection of the many micro lessons that we pick up throughout our journey.” Functionality is also key to Talkowski’s approach to robot designs, “they should imbue characteristics that define their designed functionality,” he asserts. However, he admits that the exact opposite can be unexpected and appealing if done correctly. Talkowski also forgoes the extensive details often



MODELLING A ROBOT
HARD-SURFACE MASTER GARRETH JACKSON SHARES HIS TOP TIPS

01 DO YOUR RESEARCH
Immerse yourself in the world of robots, sci-fi, cyberpunk, etc. Understand how robots are used or perceived and how they would fit into our society. Researching movies, video games, comics and art is a great way of gaining inspiration and seeing what’s out there.

02 CREATE A SOLID STORYLINE
Make sure that you think up a strong purpose, identity and story for your robot. This will give you enough material to work with when modelling the robot, as well as creating the background and working on final finishes.

03 CONSIDER DESIGN STYLE/MATERIALS
Consider the look of your robot, the shape of the head and body – are you going for a boxy, angled look? Or a smooth, round look? The style will also influence the materials you use to texture your robot, as well as the final finish. All of this ties into the story and identity of your robot, so make sure that it is well established.

04 KNOW YOUR TOOLS
Knowing the software that you use to create your model and using it to its full potential is key to a successful design. Spend time researching and getting to know the features of the software you’re using so that your finished product looks its best.

05 RENDERING AND LIGHTING
Lighting can create a sense of realism and gives your materials dimension and texture, while also making your environment come alive. It sets the mood and tone for your final composition, so taking the time to get it all just right is really important.

Steve Talkowski launched Sketchbot Studios in 2008, marking his first foray into the designer toy scene




associated with robot designs. “Some designers like to go into the minutiae of endless nuts and bolts,” he says. “I love seeing these examples. My style tends to emphasise the volumes and shapes more, and less on the nitty gritty – while still trying to make the parts work realistically and believably.” Personality is integral to Talkowski’s designs and he highlights several elements that can add some individuality: “Expressive eyes will do wonders for any character and putting

Jimenez’s work combines his proficiency in hard modelling, sculpting, texturing, shading and lighting

Below: Jimenez modelled this robot character in 3ds Max and ZBrush with detailed parts later textured in Photoshop

the robot into comical poses also greatly adds to its personality. Colour can also be used to take away preconceived notions of cold and hard-edged technology.” For Jackson, creating a robot design is a balancing act of good design principles and clean, well-proportioned models. He adds that artists should bear this in mind while allowing room for their creativity and imagination to push boundaries and think outside the box. He continues: “In this way you avoid creating an overly generic

design or one that is too flawed from a design perspective.” A unique style can also be found by embracing art and culture from around the world. “It’s a very Asian-influenced style,” says Jimenez of his own approach to robots, “with electric colours and complementary colour combinations. The truth is that it always takes me a long, long time to define the final colours. It’s very important to me.”

THE IDEA FACTORY Using 3D software to build a breathtaking 3D robot is only half the task; first, artists need to come up with a concept for their character, something that each of them approaches differently. Jimenez always has an image of what he wants to create in his head, rarely making a concept more detailed than a paper sketch. “I usually create my robot designs using references from real objects and mixing them,” he adds. Once Jimenez has gathered references he models them directly, modifying each object until he’s satisfied with the outcome. “It’s more of a creative than logical process,” he explains. “My initial inspiration always came from the anime series FLCL and the character of Kanchi, as well as anime and Asian culture generally.” He cites some of the leading artists in this area as Ian McQue, Yeon Guun Jeong and Simon Stalenhag. This influence is evident in the simplicity of Jimenez’s forms and the humanisation of his robots. The initial stages of Perez’s artistic approach involve their fair share of trial and error. “I usually start with a vague idea that I’d like to explore,” he says, “something like a bot that carries water. Then I’ll ask myself, why is it carrying water and what was its original purpose?” Contemplating their function and origins helps Perez to begin exploring shapes and honing the design of his robots. “Finding and creating an interesting shape can take a good amount of time,” he continues. “Organic hard surfaces are an area that I am working overtime to be better at.” Despite the more simplistic language of his designs, Talkowski




strives to not repeat certain ideas when plotting out his robot characters. “This means trying to make sure my joints and mechanics have a believability to them,” he explains, “so that when they are built and eventually animated, they perform as expected.” Part of Jackson’s process is to dream up backgrounds and stories for his robots. “It’s important to understand your creation. Having a specific background and story influences the overall design and directs the robot to be more than just an object, but rather a character with a purpose and story.” Jackson gives each of his robots a name, personality and identity as part of his conceptualisation process. “I especially consider what they would be used for,” he continues. “For example, are they an

engineer? Or used for combat? And so on. Knowing their identity helps to create the details that give the model dimension and substance.”

“A SPECIFIC BACKGROUND AND STORY DIRECTS THE ROBOT TO BE MORE THAN JUST AN OBJECT” Garreth Jackson, hard-surface artist and concept designer

Where many artists would translate their initial ideas into a 2D illustration or rough sketch before utilising 3D tools and software, Jackson chooses to work directly on a 3D model, working from his internal vision of what the final piece should look like. “The identity and final look of the character will develop as I build the different parts, and will sometimes change from what


Right: This endearing robot is Steve Talkowski’s entry for day 17 of Botober back in 2017

Steve Talkowski has previously worked on films such as Ice Age, Alien Resurrection and award-winning short, Bunny




I originally envisioned, but the story and identity of the model will dictate the final design,” he adds.

BUILD-A-BOT When it comes to assembling their robots, each artist has a unique approach, utilising different software, tools and techniques. 3D World asked its experts to explain their workflows, in order to discover how they create their mechanical masterpieces. After his initial sketches have been completed with pencil and paper or stylus and iPad, Talkowski dives straight into Oculus Medium VR, ZBrush or Maya. With the model complete he moves it over to Substance Painter for texturing, before using either Redshift, RenderMan or Arnold for GPU rendering. “3D allows me to quickly visualise the shapes and forms of my robot,” he explains, “and I try to keep them as close to the initial sketches as possible.” With an idea in place, Jimenez dedicates several days to collecting references. These range from random objects to industrial

pieces, as well as television sets from the mid-20th century that always form the head of Jimenez’s robots. He begins by blocking and creating a proxy with the basic volumes. “Then I go piece by piece, detailing it in its high-poly version and doing the UVs,” he adds. Jimenez’s robot designs often feature intricately detailed body parts, all of which he sculpts by hand in Maya. “For my next illustration I plan on trying Blender for its Boolean system,” he reveals. “I am quite a craftsman and not a man of many tricks. For example, if I have to make a cord, I model it by hand and make the loop just as I would in real life.”

To create his robots Perez employs his proficiency in modelling, game development, PBR texturing and VR

Perez uses 3ds Max to sculpt his unique robot characters, rendering in OctaneRender for an eye-catching end result

“I FEEL LIKE THE DESIGN IS ASSEMBLING ITSELF AND I’M JUST THERE TO CREATE AND MOVE THE PIECES AROUND” Jarlan Perez, visual designer


Once everything is modelled, Jimenez begins texturing, first extracting the mesh maps and using them as a reference for the maps he creates in Photoshop, although he occasionally relies on Substance Painter for ease and speed. He continues: “I then generate the rest of the maps and work on the materials. The creative process is simple and quite linear in my case.” Perez dives straight into 3ds Max or ZBrush to explore shapes and ideas, before creating the pieces and beginning assembly. “Over time I’ve built a fairly extensive kitbash library of components that I can pull from if it fits the design or I build and modify what I need,” he adds. When creating these exploratory pieces, Perez says that he tries to avoid being too prescriptive and following a fixed design path. “I’m exploring what works and what doesn’t,” he continues. “I often feel like the design is assembling itself and I’m just there to create and move the pieces around until it’s found its form.”


TEXTURING MACHINES
DAVID DOMINGO JIMENEZ SHARES ADVICE FOR TEXTURING YOUR ROBOTIC CREATIONS

01 USE MESH MAPS
Mesh maps like ambient occlusion, cavity and position are essential – you can’t create something specific and realistic without these maps when building mixing masks.

02 NOISE MASKS
Dirt, scratches and humidity maps are all important as they are the essence of a good texture. Do not use procedural noise masks for everything; use different masks for different parts.

03 PHOTOSHOP TRICKS
Match Color is a wonder that will help you balance and harmonise all your textures with the same spectrum of colour and luminance. Reducing the noise to a force of 5 and the rest to 0 will help you to stylise textures that are too noisy without blurring them.

04 PAY ATTENTION TO DETAILS
No realistic material lacks dirt or micro roughness. Each material, each piece of dirt, each different layer has different properties, even if they are subtle – pay attention and define them. Even the most imperceptible details count.


SAVE UP TO 67%* WHEN YOU DOUBLE YOUR SUBSCRIPTION!


SUBSCRIBE!

THREE OFFERS TO CHOOSE FROM!

Option 1: Annual print subscription, £65* – Save up to 28%
• 13 issues of the 3D World print edition
• No-hassle delivery
• Never miss an issue

Option 2: Annual print + digital subscription, £75* – Save up to 58%
• 13 issues of the 3D World print edition
• Instant access to digital edition via your iOS or Android device
• No-hassle delivery
• Never miss an issue

Option 3: DOUBLE UP! Annual print + digital subscription, £120* – Save up to 67%
• 13 issues of both the 3D World and ImagineFX print editions
• Instant access to digital editions
• Gain insights from industry-leading 2D and 3D artists

SUBSCRIBE AND SAVE BY VISITING:

MYFAVOURITEMAGAZINES.CO.UK/TDW/DBUNDLE

Spool Bot by Steve Talkowski

Offer available to new subscribers worldwide!

*Terms & conditions This offer is available to all-new subscribers. Double-up saving will be automatically applied at checkout. Subscriber will need to add both ImagineFX and 3D World print + digital bundles to the basket separately for offer to apply. Prices and savings quoted are compared to buying full-priced print and premium subscriptions. You’ll receive 13 issues per subscription. You can write to us or call us to cancel your subscription within 14 days of purchase. Payment is non-refundable after the 14-day cancellation period unless exceptional circumstances apply. UK calls will cost the same as other standard fixed line numbers (starting 01 or 02) or are included as part of any inclusive or free minutes allowances (if offered by your phone tariff). For full terms and conditions please visit www.bit.ly/magterms. Offer ends 30 April 2020.



PART 1: ART AND STORY

ANIMAL LOGIC This new series explores animation studio Animal Logic, starting with a behind-the-scenes look at the art and story departments


Animal Logic: Art and Story Members of Animal Logic’s art department at the Sydney studio

Animal Logic is one of those rare visual effects studios that made a successful jump into being an animation studio. It was behind the Oscar-winning Happy Feet, as well as Peter Rabbit, Legend Of The Guardians: The Owls Of Ga’Hoole, The LEGO Movie and subsequent LEGO films. In the VFX world, Animal Logic has worked on countless projects, including The Matrix, Moulin Rouge, 300 and several Marvel productions. Animal Logic’s development arm, Animal Logic Entertainment, has propelled the studio further into being directly involved as a producer or the developer of specific film properties – something often seen as the ultimate dream for studios wishing to work on their own intellectual property. Significant players in the development, animation and visual effects activities of Animal Logic are its art and story departments. It’s in these teams where ideas get fleshed out into the visual medium, where characters and environments are developed and where continual iteration happens throughout the filmmaking process. In this first part of our new series on Animal Logic, 3D World receives a hands-on tour of the art and story worlds at the studio, including a look at what’s required to work in art direction and story, tales from recent productions and a peek at various tools of the trade. This in-depth Animal Logic series will continue in 3D World with behind-the-scenes access to the studio on their VFX and animation technology and pipeline, a look at the camera and animation teams, how lighting and FX are handled, and a special focus on Peter Rabbit 2.

PETER RABBIT™ images © 2018 Columbia Pictures. Legend of the Guardians: The Owls of Ga’Hoole images © 2010 Warner Bros. Pictures. The LEGO Batman Movie images © 2017 Warner Bros. Pictures. Captain Marvel image © 2019 Marvel Studios.

WHAT HAPPENS IN ART AND STORY At first, it might seem obvious what takes place in the art and story departments for an animation studio: this includes concept art, storyboarding and design work. But, as art director Felicity Coonan points out, there is a lot more involved. “I think of our role as interpreting the director’s vision into all the


Animal Logic: Art and Story

Above: An art department environment concept for Batman’s cave in 2017’s The LEGO Batman Movie
Right: Animal Logic’s concepts for LEGO Batman had to communicate a tone and mood that would be followed throughout the rest of the pipeline

“IT’S GREAT TO HAVE A HAND IN THE PRODUCTION FROM THE VERY GENESIS” Toby Grime, art director, Animal Logic

Far right: A final shot from Captain Marvel featuring a hologram designed and executed by Animal Logic

different departments. We have an opportunity to craft the look of the film, to create environments, characters and cultures within the world. I sometimes refer to it as ‘digital anthropology.’ We then handhold that all the way through the process, from early concepts, storyboards and asset creation all the way through to marketing art.”

“We’re probably one of the only departments that would be involved from before a film is even green-lit, working all the way through production and even onto the end credits,” adds art director Toby Grime. “For example, on the first Peter Rabbit film, we were there four years before delivery developing a visual pitch package to support the writer – creating concept art and so forth for presentations and creating a deck to support the producers to present to the studio. As a department it’s a great feeling to have a hand in the production from the very genesis to the final frame.”

In terms of story, that work is typically handled by story artists at Animal Logic. Scott Hurney, story lead on the upcoming Peter Rabbit 2, says that a “story artist has to visually depict a script, or if there is no script, visually depict ideas that help to create a story or script.


Story sketches or storyboards create a visual blueprint for a film.” “Day by day,” continues Hurney, “a story artist’s duties are to apply their excellent drawing skills and knowledge of film language to draw scenes and sequences for a film. The storyboards need to convey acting, action, environment and mood. Story sketches and storyboards can be verbally presented and ‘pitched’ to a director or supervisor, or edited together and presented in the form of an animatic. Storyboards are often re-iterated and re-drawn many times as the film or project is continually developed and refined. Story artists often also need a thorough understanding of story structure and theory to help visually create and provide ideas for a story or film.”



GET TO KNOW THESE TOOLS

WANT TO WORK IN ART AND STORY? ANIMAL LOGIC’S TEAM BREAKS DOWN THEIR GO-TO TOOLS FOR PRODUCTION

Think outside the box: I mostly use the Adobe suite, but really you should just use whatever's required for your problem. I was working on a project for a circular screen and I had trouble getting my head around how it would work. So I would print things out on paper and just stick them back together in a circle. That was my previs: cutouts and sticky tape. – Toby Grime, art director

Fonts and VR: I'm likely to get out real brushes and inks. I also love making fonts, and I've got a nice little tool called Fontself Maker which plugs into the Adobe tools. Production designer Kim Taylor also used Tilt Brush here to create the jungle sequence for The LEGO Ninjago Movie, which let the director do a walkthrough in VR. – Felicity Coonan, art director

PROBLEM SOLVERS

It is within the art and story teams at Animal Logic where complex story or design issues need to be solved at the early stages – and even late stages – of a film’s formation. Here’s a look at some of the ways in which the teams have tackled story and design challenges over the years.

On The LEGO Batman Movie, production designer Grant Freckelton recalls that one of the biggest challenges was emphasising Batman’s loneliness through his environment and his reliance on technology. “Batman’s mansion and cave were vast and layered with technology, the biggest version of the Bat Cave yet seen on film,” Freckelton notes. “On one level this was meant to parody other superhero movies and our tendency to fetishise wealth and tech in modern blockbusters. But it also served as a vast backdrop to drive home how sad a figure Batman is when he’s not punching bad guys.”

Freckelton also flags the challenges the team faced on The LEGO Movie 2: The Second Part, where production designer Patrick Hanenberger and effects supervisor Mark Theriault and their teams in Vancouver needed to design a new inter-dimensional portal for the ‘stair gate’ sequence. “Rather than relying on sophisticated FX, they printed out sequenced frames of the layout and asked the wider crew to rotoscope whimsical doodles around the 3D layout using pencil and crayon. It was much more fun and creative, and the results resembled the hand-drawn effects from the 1980s music video for Take On Me. I


Real-world tools: I’ve been breaking into Maya and ZBrush, and I use a Wacom Cintiq, but I also like real-world traditional tools like pencil and paper. It's always good to break out of the digital world and get back to the roots to get the feeling and the emotions while illustrating. Even with LEGO, I would get pieces of paper and chop out things to try and work out how it would fit on them. That was really, really fun. Even using Blu-Tack to try and form a wig on a LEGO character head. – Fiona Darwin, concept artist/art director

Cutting storyboards: I use Photoshop and Toon Boom for storyboard work. For editing storyboards, I like Premiere. It has, alongside all the cutting tools, a designated re-timing tool where you can cut then stretch. This feature is probably completely impractical for editing live-action footage, but if you're trying to prototype the scene with previs or storyboards, it's the best thing. – Simon Ashton, storyboard artist



CRAFTING A SCENE

FOLLOW ALONG WITH THESE STORYBOARD PANELS FROM PETER RABBIT, DRAWN BY SIMON ASHTON FOR A SCENE IN WHICH PETER AND MR. MCGREGOR FIGHT




guess that made their solution a literal ‘A-ha’ moment.” Art director and concept artist Fiona Darwin also highlights the LEGO films as an example of where the art and story teams needed to solve specific design problems, while doing so within the confines of the LEGO brand ‘style guide’ that restricted brick types, colours and even the way pieces connected. Some of that could be worked out via the use of the publicly available LEGO Digital Designer (LDD) tool, while other aspects required more practical solutions. “I loved the challenge of getting in there and playing around with the bricks,” recalls Darwin. “I might be building the interior of

an airplane and deciding whether or not the seats are three studs or four studs wide for each of the characters, and what was going to be interactive about them and how that was going to affect different departments down the line.” For the first Peter Rabbit film, the script called for a fight between Peter and Mr. McGregor. “The script basically stated: ‘Peter and McGregor have a fight’, and that was it,” says Simon Ashton, who was a senior storyboard artist on the movie. “I had to come up with a scene that was not only comedic, but with violence that felt real and not one-sided. It couldn’t be Peter getting the upper hand all the time; he had


Below: Storyboard panels depicting a fight scene between the mischievous Peter Rabbit and Mr. McGregor, serving as a visual pitch for the action


to take a beating also. At this stage we were still trying to define the comedic tone of the film, so I referenced movies like Blake Edwards’ The Pink Panther and Rock ‘n’ Roll Wrestling footage. I loosely boarded it up and pitched it to the director, and he loved it!” In terms of visual effects, Toby Grime contributed to key designs for some holograms in Captain Marvel, noting that the team was given wide scope on how they should look and move at the art direction stage. “We came up with the idea that these holograms would expand and compress like an accordion. It’s handy to relate things back to very analogue, physical things, which gets the idea rolling



quickly. You’ve seen holograms and UI many times in films, so do you just go down a well-worn path or do you create a whole new look? If you make it too obscure, people don’t get what’s going on, which comes back to the golden rule: support the storytelling in that moment at all times.” As noted above, what’s interesting is that the art and story teams don’t tend to just be involved at the beginning of production. They can have an influence at all stages, as Animal Logic director David Scott, who has worked in story, art direction and VFX at the studio, attests. “Even modelling and blend shapes can be a heavy art direction task. You get the first pass of a model, but then it comes back to the art department for paintovers in Photoshop as a guide for corrective shapes. Art direction also extends into post-production at the lighting stage. After a review a shot might come back to the art department for a quick paint-over to illustrate where a keylight is, or to rebalance or grade an iris highlight for a character.”

ANYTHING IS POSSIBLE

“WE’RE KEEN TO BE CHALLENGED. THAT’S WHAT I LOVE ABOUT DOING THIS WORK” Felicity Coonan, art director, Animal Logic

One thing is clear after spending some time with the art and story team at Animal Logic: they are extremely well versed in imagining a multitude of worlds, characters and scenarios, often fitting into what David Scott describes as ‘art-directed realism’ – the closest thing, he says, to an Animal Logic house style. Each project seems to outdo the previous one, and also pushes forward the studio’s technology (something that will be discussed in future parts of this series). Indeed, Felicity Coonan sums up her role as an art director at Animal Logic as sometimes being


about ‘breaking the machinery’. “We’ve got a really super-duper refined pipeline, but if we’re sitting comfortably in that lane then we’re not doing our job. We’re actually looking at a couple of projects now that are totally going to make the technical and R&D teams sweat. But we’re keen to be challenged. That’s what I love about doing this work.”

Discover more at animallogic.com


PRESENTING DESIGN ITERATIONS WITH PHOTOSHOP

ANIMAL LOGIC CONCEPT ARTIST AND ART DIRECTOR FIONA DARWIN SHARES HER PHOTOSHOP TIPS FOR CREATING AND EXPORTING DESIGN ITERATIONS FOR PRESENTATION

1. GET TO KNOW LAYER COMPS

Layer Comps let you take ‘snapshots’ of your setups in a single Photoshop file. It means you can cycle through your Layer Comps to show multiple iterations of a character or composition without having to consider folders and turn their visibility on and off manually. In the example here I have used Layer Comps to organise iterations of the Lady Iron Dragon character designs for The LEGO Ninjago Movie. The folders are colour coded so you can see how each is turned on and off as a different Layer Comp is selected.

Above: Animal Logic concept artist and art director Fiona Darwin
Left (top): A character sheet for Peter Rabbit helped establish different poses and expressions

2. LAYER COMPS AND ACTIONS

You can utilise Layer Comps in actions to export your designs in specific ways. For example, when I am presenting character options in a lineup I need each of the character designs on their own transparent layer in order to quickly compile them in a presentation file for review. It can take a long time to manually duplicate, merge and tally iterations when you have a lot of them. To speed up the process I created an action that selects a specifically named folder, duplicates it, merges it and then selects the next Layer Comp.

3. READY FOR REVIEW

Hit play on the action and a new layer is created for that iteration of the character, after which the next Layer Comp is selected. Because the next Layer Comp is selected at the end of the action, it can be played over and over again to quickly create a flattened, merged layer for every character design. Each character iteration ends up on its own layer, while the unmerged artwork used to build each of them remains intact under the ‘01_Merge_all’ folder. Once all the designs are ready they can be easily compiled and aligned in a presentation file for review.

Left (bottom): Legend Of The Guardians: The Owls Of Ga’Hoole artwork informed colours, textures and different pieces worn by the characters



The Aeronauts

Inspired by meteorologist and astronomer James Glaisher and aeronaut Henry Coxwell breaking the world flight altitude record in 1862 by soaring to 36,000 feet in a coal-gas-filled balloon, screenwriter Jack Thorne (His Dark Materials) collaborated with filmmaker Tom Harper (Wild Rose) and Amazon Studios to make The Aeronauts, starring Eddie Redmayne and Felicity Jones. With over 80 per cent of the action taking


place above England under various weather conditions, visual effects had a pivotal role to play in making the biographical adventure film a cinematic reality. Overseeing the CG contributions of Framestore, Rodeo FX and Alchemy 24 was production visual effects supervisor Louis Morin (Arrival), while the physical builds were the responsibility of co-production designers Christian Huband (Justice League) and David Hindle (Bohemian Rhapsody). “The one image that sticks in my mind when it comes to the sky

TREVOR HOGG LEARNS HOW PRODUCTION DESIGNER CHRISTIAN HUBAND AND FRAMESTORE WERE ABLE TO MAKE VISUAL EFFECTS THAT SOAR TO GREAT HEIGHTS IN THE AERONAUTS…



“Tom worked closely with the art department to make meticulous concept art of establishing shots” CHRISTIAN KAESTNER, VFX SUPERVISOR

was a print that was used in the book about the balloon journey that was done by Glaisher, and it was a stylised print that had The Mammoth above the clouds along with shooting stars,” recalls Huband. “This was a provocative, beautiful thing, slightly fanciful because I don’t think at 37,000 feet you can generally see the stars unless it’s at the end of the day or night. The decision to show the stars was something that has a

Top: Eddie Redmayne and Felicity Jones portraying their roles of James Glaisher and Amelia Wren
Above: Filmmaker Tom Harper discusses a bluescreen shot with cinematographer George Steel

certain theatricality which feeds into the story and performances. We did a sky chart, which was a graphic representation of what altitude they’re at and the time, from the butterfly scene to the snow and ice scene. In the end, you send a helicopter up, which they did over South Africa, Louisiana and in different parts of the world, to film plates and you have to work with what you get.”

“Originally when we took on the project it was supposed to be bluescreen live-action, six-camera array backgrounds and 2D compositing work,” reveals Framestore VFX supervisor Christian Kaestner (Captain Marvel). “Obviously, as you do the movie, the director will say, ‘I want it to actually look like this.’ Then all of a sudden, you’re in art-directed skies and clouds. Since we were using our own renderer, Freak, we


were able to implement the latest render technologies and algorithms and optimise them where we could. The render algorithms are written in a way where they react specifically well to all possible lighting scenarios. But above the clouds we were exposed to bright sunlight, so we could cut corners in our shader algorithms.

“On top of that, scenes were being intercut between six RED Monstro camera array stitches and CG shots. We ended up rendering high-resolution cube maps which we were able to tile to our desire. The big cloudscapes above the clouds were over 100 rendered tiles, so if a 50mm lens was needed we would not be above the resolution. Volumetrics were then dressed in between the camera and basket, and between the basket and the distance, for parallax and a sense of travel.”
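As a back-of-the-envelope illustration of why tiled, ultra-high-resolution backgrounds are needed – not Framestore's actual pipeline maths – the numbers can be sketched in Python. The 36mm sensor width and 4K frame size here are assumptions:

```python
import math

def horizontal_fov_deg(focal_mm, sensor_width_mm=36.0):
    """Horizontal field of view of a lens on an (assumed) 36mm-wide gate."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_mm)))

def panorama_width_px(fov_deg, frame_width_px):
    """Pixels needed across a full 360-degree background so that a crop
    covering fov_deg of it still fills frame_width_px without upscaling."""
    return math.ceil(frame_width_px * 360.0 / fov_deg)

fov = horizontal_fov_deg(50)         # a 50mm lens sees roughly 39.6 degrees
print(panorama_width_px(fov, 4096))  # tens of thousands of pixels across
```

A 50mm lens sees only about a ninth of the horizon, so a full environment sharp enough for a 4K frame runs to tens of thousands of pixels across – hence rendering the cloudscape as many tiles rather than one image.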



Mammoth undertaking

A REPLICA OF THE MAMMOTH BALLOON WAS PHYSICALLY AND DIGITALLY BUILT FOR THE PRODUCTION

Plates were shot in New Orleans by Morin to get some different altitudes, and Framestore provided an iPad app called FarSight that enabled the 360-degree stitch to be viewed and helped to determine such things as the placement of the sun. “Luckily, we also had South Africa plates that Tom selected which gave us an idea of his sense of what the cloudscape should be,” remarks Kaestner. “Tom had a good vision of what he wanted for the storytelling and worked closely with Martin Macrae in the art department to make meticulous concept art of establishing shots and of what the cloudscapes should feel like, including the colour palette, the amount of scattering and the desired lighting. It gave us a good guide and we could focus on the technical side of things.”

The concept art of the cloudscapes was indispensable. “It’s such a subjective thing so it was tricky to get the right look that was suitable for each need,” notes Framestore CG supervisor Benjamin Magana. “Our camera layout had to have a rough way of looking at the environment.” There was custom modelling to the concept art and procedural generation of basic shapes with additional surface detail for the cloudscapes, explains Kaestner. “One scene was 60km wide and had 550 instances of different clouds.”

A big story point was for the audience to be able to distinguish between the different altitudes. “Because we had basic detailed information for every stage of the balloon ride, such as how fast it would ascend and how high it should be, the layout department took on an important role because the environment department would then accurately put the


Middle (top to bottom): The clouds are rendered with The Mammoth positioned in the frame; further clouds are rendered to give more depth to the imagery; the final frame with clouds merged with the sky and sun
Above: The Mammoth soars through cloudscapes which were a combination of plate photography and CG


“The primary thing that took David Hindle and I by surprise when first meeting Tom Harper was his insistence on making a real balloon,” recalls co-production designer Christian Huband. “Our instincts as filmmakers would be to resist that and say, ‘Hold on a minute. If we’re going into space we don’t need to make the space shuttle. We’ve got to think about this practically.’ But in this particular case having the real balloon gave us a cipher that informed everything else that we then did.”

The physical build also served as a basis for the digital double. “We LiDAR scanned the real basket and extrapolated from that,” explains Framestore CG supervisor Benjamin Magana. The Mammoth was a heavy digital asset that needed to float in huge cloudscapes and withstand extreme close-ups in various conditions ranging from dry to wet to frozen to destroyed. “The rig, detail and resolution of the balloon had to be insanely high,” remarks Framestore VFX supervisor Christian Kaestner. “For the frozen version, we could do a simulation of a bulged balloon when the cloth pushes through the net, but on top of that the effects department could run particle and frost simulations.”

Damage continuity had to be maintained. “In the storm one of the ropes breaks,” states Kaestner. “We always had to make sure that the orientation of the layout was accurate to what it needed to be, so the ladder was always in the right spot in relationship with the hanging basket.”


sun positions and haze,” remarks Kaestner. “For example, there is less haze at 30,000 feet so you will have a different visual result.” Wispy clouds and rain can exist at the lower altitudes whereas high up is extremely dry. “Sometimes in Montreal when it’s extremely cold and the sun is out you get ice particles floating about. I suggested this to Tom and he was like, ‘Okay.’ As the show progressed, we were able to dress them in as needed. Towards the end of post-production, Tom kept on saying, ‘Christian, those ice particles are gold!’ It was helping to tell the story so much and added a little bit of life to the photography.”

A major action sequence has Amelia Wren (Felicity Jones) climbing the frozen balloon in an effort to reignite the extinguished flame. “One of the biggest challenges was that the curvature of the set piece didn’t quite match our CG build,” reveals Kaestner. “For some of the shots we roto’d Felicity Jones out of the plate and manually realigned all of the knots, nets and ropes, because some of it needed to be extended down anyway as the set wasn’t high enough.” The only practical elements shot in a cold environment were the breaths. “After the shoot was done and we had the cut, Louis and our compositing supervisor Anthony Luigi Santoro planned out the lines and breathing that we needed. Breath elements were

Weathering the storm

THE MAMMOTH GETS CAUGHT IN THE MIDDLE OF A VIOLENT THUNDERSTORM WHICH WAS A COMBINATION OF PRACTICAL AND CG ELEMENTS…

“You’d be surprised how many stunts Felicity Jones and Eddie Redmayne were actually doing themselves,” notes Kaestner. “There was a real attempt from special effects as well to have a suspended basket that had the right angles and the correct number of ropes with the big ring so it could rock about. But once we put all of that together in a sequence it didn’t have quite the right physics, so we ended up doing lots of post camera moves.” “We had to build around the stunts and connect them to the shots where James and Amelia are digital doubles hanging off of the balloon,” explains Magana. “The layout department built the base of the pacing of the sequence by getting the correct dynamic of the balloon going in and out of those clouds, in order for the stunts and digital double shots to make sense together. It took us a couple of goes between having the whole sequence laid out and looking at it in terms of timing, rotation and speed of the shots.”



Opposite: Rough animation blocking of the additional ropes being attached to the balloon, and the final shot with the CG ropes integrated into the plate
Left and below: A live-action plate of Eddie Redmayne having an encounter with a butterfly while Felicity Jones looks on, and the final shot with the CG butterfly situated on Redmayne’s hand

shot in ‘the fridge’ against black. We took accurate measurements of everything in order to put the breath elements back in at the proper scale. In Nuke, we used the head tracking system for each shot that needed the breath element and put those elements onto cards. The head movement did not always correspond with the breath element, so we needed to be clever when releasing the card into space before the head was turning. There were over 250 breath elements that were heavily art directed per shot just to get the right sense.”

A swarm of butterflies makes a dramatic appearance. “It was something that we kept working on because we wanted to change the look of the butterflies,” states Magana. “We had to extrapolate from the hero butterfly to the swarm, which was a particle simulation that had cycles of butterflies flapping their wings. It was a huge swarm. We had to devise how to get the right motion blur for the particles and make the animation cycle for the flapping of the wings as accurate as possible.”

There were some practical butterflies on set that were all replaced by CG versions. “The one on the finger of James Glaisher was a rotomation of the actual one,” explains Kaestner. “We have done butterflies before but more like a hero creature. In this show, there were 7,249 butterflies that needed to look the same even if we couldn’t afford to render them all with the same groom. We visually decided what was necessary for each


“We had to make the animation cycle for the flapping of the wings as accurate as possible” BENJAMIN MAGANA, CG SUPERVISOR



Images courtesy of Amazon Studios and Framestore

The Aeronauts

group of butterflies. The hero ones had high-detailed scattering with a groom on their wings so it would feel like velvet, but the light needed to be coming through so you can see these little strands.”

The liftoff of the Mammoth was an entirely different challenge compared to the aerial scenes. “Vauxhall Gardens itself was our biggest set build with the arena, platform, stands, fairgrounds and stalls,” notes Huband. “Our art department team did a 3D model of the stands so it could then be sent on to Louis to build in whatever medium was needed to work in visual effects. For a while we weren’t sure how much we were going to be permitted to build and whether it was going to be the lower storey. In reality it became two storeys. The only constraint in fact was the number of extras and costumes that we could afford to populate these stands. There was no point in building an empty stand.”

The sequence was more straightforward for Framestore, who are used to building cities for other projects. “We had this epic 2,000-frame shot as Amelia and James slowly fly in a helicopter plate, then we took over and it became a full CG shot. We built a library of buildings that included residential and factories as well as smoke stacks and boats. We dressed in crowds with horses and carriages. Because it needed to feel like London of 1862, historical maps were projected on the floor and we laid out our buildings. London is distinct in its house layout and roads are recognisably crooked to each other. We used historical data and did instancing of various versions of numerous models.”


Top: The largest set build was Vauxhall Gardens, which required bluescreen set extensions
Above right: Concept art of Amelia Wren as she collapses in exhaustion after climbing to the top of the frozen balloon


“Building the real balloon and having the privilege of somehow getting Eddie and Felicity up in that was the biggest challenge,” notes Huband. “They flew and even crashed. The insurers would be freaking out if they had known! To be able to do that is a career high. Seeing Helen Bailey, the stunt double for Felicity, climb the balloon at 2,000 feet up in the air and then get to the top and ask, ‘Do you want me to do it again?’ – she was mad and amazing! I can’t wait for people to see Amelia climb to the top of the balloon. I’m hoping that it feels like Free Solo, or that sort of precipitous experience – or even worse in a way, because it’s so wobbly and bouncy – that’s not something you’ve seen in any films. For me, the inherent challenges in that were difficult and rewarding in equal measure.”



Practical tips and tutorials from pro artists to improve your CG skills



Learn how to render like Pixar part 3

FOLLOW THE VIDEO http://bit.ly/3DW-robots

RENDERMAN | MAYA

LEARN HOW TO RENDER LIKE PIXAR

Unlock the secrets of Pixar's RenderMan with VFX artist Rusty Hazelden

AUTHOR

Rusty Hazelden Rusty Hazelden is a visual effects artist, writer, and YouTuber based in Halifax, Nova Scotia. He makes videos about movie visual effects techniques. YouTube.com/RustyHazelden


For over 30 years Pixar’s RenderMan has been used in the film industry to render movies featuring groundbreaking animation and visual effects. In this four-part tutorial series, we’re going to discover how to harness the power of RenderMan by taking an animated shot from start to finish, and learn all of the techniques required to create a photorealistic animation using Pixar’s RenderMan for Maya.

Using RenderMan to create photorealistic renders has never been easier. In this series we’re going to begin with untextured objects in Maya, and learn how to create surface materials, set up lights, adjust camera attributes, customise the render settings, and batch render the final animation to disk using Local Queue as a series of EXR images. By the end of the project, you will have a solid



understanding of the workflow used to light and render a dramatic night-time scene using RenderMan for Maya. Along the way, you will learn numerous tips and tricks that will come in handy on your next RenderMan project! In part three of this tutorial series, we’re going to explore the powerful Alembic workflow in RenderMan and find out how to assign materials to Alembic archives using the Dynamic Rules Editor. Once we finish texturing the objects, we’re going to learn how to create a dramatic lighting setup that evokes the feeling of a work light outdoors at night.

DOWNLOAD YOUR RESOURCES For all the assets you need go to http://bit.ly/3DW-robots




01 OPEN THE MAYA SCENE

Let's continue from where we left off in the previous part. Go ahead and open your Maya scene from the end of part 2. Select the R_Nail object in the Outliner. Switch to the Rendering menu set, and from the Lighting/Shading menu, select Assign Existing Material>Nail_Material. This will apply the Nail_Material to the R_Nail object. The rest of the nails scattered throughout the scene are loaded from an Alembic file.

02 CHECK THE ALEMBIC SETTINGS

Select the Scattered_Nails object in the Outliner, and then switch to the Scattered_NailsShape tab in the Attribute Editor. This tab contains many useful settings that will affect how the Alembic file is rendered in RenderMan. There are a few important settings that you should always double-check before rendering an Alembic file. Expand the Render Stats section and make sure that Visible in Reflections and Visible in Refractions are both enabled. Then expand the RenderMan section and make sure the Render Alembic Cache checkbox is enabled.

What is an Alembic file?

RenderMan models can be loaded efficiently at render time by exporting Maya geometry to ‘Alembic Archives’ that use the ABC file extension. These files are then loaded into the Maya scene as referenced caches, which makes Alembic a great way to construct large scenes containing thousands of objects.

03 THE DYNAMIC RULES EDITOR

04 CREATE A DYNAMIC RULE

This Alembic file only contains static nail geometry. That means the nail models that are being loaded from this Alembic file are resting on the ground and not moving. If you have an Alembic file containing animated geometry, you may need to enable the Use Global Time checkbox to load animation data. To assign a material to an Alembic file, we're going to use the Dynamic Rules Editor which can be found in the RenderMan shelf. Click on the Dynamic Rules Editor icon to open the editor.

Click the Plus icon in the Dynamic Rules Editor to create a new rule. A rule is used to link a RenderMan material in the Maya scene with an object contained in an Alembic archive stored on disk. Alembic archives are a form of file referencing. The Alembic file is read by RenderMan at render time, but the actual geometry data is not stored in the Maya scene file. The first text field is for the Material ID, which is also known as the payload. Type in ‘Nail_SG’ and press enter. This is our nail material.


05 INSPECT ALEMBIC ARCHIVES

06 LIGHT THE SCENE

The next text field is for the rule that will select specific objects contained within the Alembic archive. This is also called the Expression field. The Alembic file contents are listed in the Inspect Alembic section of the Attribute Editor. Double-click on the Scattered_Nails entry at the top of the list in the Attribute Editor. Copy this text, and then click on the Expression field in the Dynamic Rules Editor. Type in a forward slash ‘/’ and paste the text ‘Scattered_Nails’ into the Expression field. Press enter to complete the rule. After modifying a rule you'll need to restart the IPR render to update the Maya Render View. Then close the Dynamic Rules Editor.
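The idea behind a dynamic rule – a material ID (the payload) resolved against object paths inside the Alembic hierarchy at render time – can be sketched outside Maya in plain Python. The wildcard matching and the 'Default_SG' fallback below are illustrative assumptions, not RenderMan's exact expression grammar:

```python
from fnmatch import fnmatch

# Each rule pairs a material ID (the 'payload') with a path expression.
# The first matching rule wins, so specific rules go before general ones.
rules = [
    ("Nail_SG", "/Scattered_Nails*"),  # the nails archive from this tutorial
    ("Default_SG", "/*"),              # hypothetical catch-all material
]

def resolve_material(alembic_path):
    """Return the material ID of the first rule whose expression matches
    the object's path inside the Alembic archive."""
    for material_id, expression in rules:
        if fnmatch(alembic_path, expression):
            return material_id
    return None

print(resolve_material("/Scattered_Nails/nail_042"))  # Nail_SG
```

Because the lookup happens at render time, re-exporting the Alembic archive with more nails requires no material reassignment – any path under /Scattered_Nails picks up the same payload.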

Let's zoom out a bit in the perspective view, so we can see the whole scene. It's a good idea to use a simple lighting setup during the texture development stage so you have a consistent, bright environment while developing the materials. Now that we've textured the work light, it's time to move on to lighting the scene. We're going to learn how to create a more dramatic lighting environment that evokes the feeling of a work light outdoors at night.


Learn how to render like Pixar part 3

Many styles of lights
RenderMan provides a wide variety of lights. The most commonly used lights include the PxrDomeLight, which uses an IBL image to illuminate the scene, the PxrEnvDayLight, which simulates a physical sky, and there are assorted shaped area lights as well.

07

DOME LIGHT INTENSITY

08

MOONLIGHTING IN RENDERMAN

09

ROTATE THE DOME LIGHT

10

SWITCH IT ON

The first light we need to adjust is the dome light. Right now it looks like a bright day, so let's start the process of changing this into a night-time environment. Set the Current Time to frame 0 so we are working on the first frame of the animation. In the Maya Render View, let's start an IPR render. Right-click on the IPR Clapboard icon and select the Render_Cam. We're going to start by changing the intensity of the dome light to resemble moonlight. Select the PxrDomeLight object in the Outliner, and reduce the Intensity to 0.15 in the Attribute Editor.

The dome light is now darker, which is the first step in creating a convincing moonlight. The next step is to change the colour of the light to make it cooler. We can tint the colour of the dome light using the colour temperature controls. Turn on the Enable Temperature checkbox to display the colour Temperature field. Colour temperature values are measured in Kelvin. Type ‘15000’ in Temperature for a cool-blue tint.
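For intuition on why 15,000 Kelvin reads as blue while tungsten values read as orange, here is Tanner Helland's well-known curve-fit approximation of black-body colour (a rough illustration only; RenderMan's temperature control is not implemented this way):

```python
import math

def kelvin_to_rgb(kelvin):
    """Approximate RGB tint (0-255 per channel) for a colour temperature."""
    t = min(max(kelvin, 1000), 40000) / 100.0
    if t <= 66:
        r = 255.0
        g = 99.4708025861 * math.log(t) - 161.1195681661
        b = 0.0 if t <= 19 else 138.5177312231 * math.log(t - 10) - 305.0447927307
    else:
        r = 329.698727446 * ((t - 60) ** -0.1332047592)
        g = 288.1221695283 * ((t - 60) ** -0.0755148492)
        b = 255.0

    def clamp(v):
        return int(min(max(v, 0.0), 255.0))

    return clamp(r), clamp(g), clamp(b)

print(kelvin_to_rgb(3000))   # warm: red channel dominates
print(kelvin_to_rgb(15000))  # cool: blue channel dominates
```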


The moonlight looks pretty good at this point. It's darker and has a nice cool-blue tint thanks to the colour temperature adjustment. Since we're going to turn on the light bulb in the next step, let's rotate the moonlight so it provides a rim lighting effect. That means we're going to rotate the moonlight so it lights the side and back of the work light. Click on the PxrDomeLight in the Outliner and then switch to the object's Transform tab in the Attribute Editor. Type ‘-180’ in the Rotate Y field. This will rotate the dome light and provide nice rim lighting.


Now let's turn on the light bulb! Select the Bulb object in the Outliner and click on the PxrMeshLight icon in the RenderMan shelf. This has created an emissive light source based upon the Bulb geometry. In the Outliner click the plus sign next to the Bulb object and select the PxrMeshLight to display the mesh light in the Attribute Editor. Switch to the mesh light Shape tab. The mesh light converts the surface of an object into a light source with many of the same controls found on a standard area light.

11

TURN IT UP

Well, the work light bulb is turned on, but it needs to be much brighter. Both Intensity and Exposure can be used to make the light bulb brighter. The Exposure control is easier to work with as the illumination level gets higher. Set the Exposure to 3.5 to increase the light output of the mesh light. The Exposure control provides a traditional photographic way of adjusting light levels in RenderMan. This Exposure adjustment would be described as increasing the light level by 3.5 stops.
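The stop arithmetic is simple: each stop doubles the light, so an Exposure of E multiplies the intensity by 2 to the power E. A one-liner to sanity-check values (illustrative only):

```python
def exposure_multiplier(stops):
    """Each photographic stop doubles the light output."""
    return 2.0 ** stops

# An Exposure of 3.5 brightens the mesh light by roughly 11.3x.
print(round(exposure_multiplier(3.5), 2))  # 11.31
```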

12

A WARMER LIGHT

To make the light bulb warm, like a traditional tungsten light source, let's use the colour Temperature control again. Turn on the Enable Temperature checkbox. The default value of 6,500 Kelvin produces cool, white light. Tungsten lights emit warm, yellow light around 3,000 Kelvin. Let's type ‘3000’ in the Temperature field to warm up this light. This looks much better. Now we have cool moonlight in the background with a bit of rim lighting on the reflector, and the light bulb is casting warm, yellow light in the foreground, illuminating the nails.

Bring in the MeshLight
If you have a piece of geometry that needs to emit light into the scene, consider using the PxrMeshLight. This approach will render faster and produce less noise in the scene, compared to using emissive surface materials for the same effect.

13

CREATE A FILL LIGHT

14

POSITION THE LIGHT

The front of the work light handle is still a bit dark, so let's create a fill light to brighten up this area in the scene. RenderMan provides a wide assortment of lights. Let's create an area light using the RenderMan shelf. Right-click on the Analytic Lights icon in the RenderMan shelf, and select the PxrRectLight menu item. The rectangular area light is created at the origin in the scene.

Lights can be easily moved in the Maya Viewport using the standard translation, rotation, and scale manipulators. After experimenting with the placement of the fill light in the scene, let's directly enter the final translation, rotation, and scale values in the Attribute Editor. In the Translate field, type in -8, 22, and 22. Then in the Rotate field, type in -45, -36, and 0. In the Scale field, type in 50, 50, and 25.

15

MATCH COLOUR TEMPERATURE

16

REFINE THE INTENSITY

Adjust the perspective view so the PxrRectLight is visible. The light is now positioned above the work light and provides extra fill illumination on the cord, handle and nails. Switch to the light's Shape tab in the Attribute Editor to display the lighting controls. Let's adjust the light's colour temperature so it matches the light bulb. It's often a good idea to match the colour temperature of your primary light sources so the lights blend together nicely. Turn on the Enable Temperature checkbox and type ‘3000’ in the Temperature field.

Objects close to lights can sometimes become overexposed. Let's use an advanced lighting control to adjust the light falloff. In the Attribute Editor expand the Refine section and scroll down to the Intensity Near Distance control. This control allows us to tone down the intensity of the light that strikes nearby objects. Type ‘40’ in the Intensity Near Dist field to see the effect. We now have more uniform lighting on the nails near the PxrRectLight.

17

LIGHT LINKING FOR MORE CONTROL

To make the lighting in the scene more dramatic, we're going to use a Maya feature called light linking to limit the fill light's illumination on the floor. Light linking is a neat feature unique to 3D rendering that would be very challenging to achieve in traditional studio photography. Light linking allows us to choose specific objects in the scene that a light source can cast light on.

18

OBJECT CENTRIC MODE

The Light Linking Editor works in either Light Centric or Object Centric mode. From the Lighting/Shading menu, select the Light Linking Editor>Object Centric option. Light linking takes place inside the Relationship Editor window. Select the floor on the left side of the window. Then deselect the PxrRectLight on the right side, and close the Relationship Editor window. Using light linking is a great way to create dramatic lighting by limiting the illumination on the floor while still allowing the rect light to illuminate objects elsewhere in the scene.

19

RENDERMAN LIGHT FILTERS

Now the scene has more contrast between the warm light bulb and the cool moonlight. The scene looks rather nice at this point, but there's one more lighting trick that can make it even better! We're going to add a RenderMan rod light filter to the scene to control the light falloff. A rod light filter is useful for limiting the lighting to a specific region in the scene. This will create an art-directed ‘pool of light’ around the work light that will gradually fade into darkness.

Set up a portal
To enhance the quality of your interior global illumination lighting, place PxrPortalLights in the openings of your set walls at each window location. Portal lights will help to reduce noise in the final rendered images and cut down on render times.
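Under the hood, the light-linking relationship is just a per-light membership set: an object receives a light's illumination only while it stays linked. A toy model in Python (hypothetical names; this is not Maya's internal representation):

```python
class LightLinker:
    """Toy light-linking table: a light illuminates an object only if linked."""

    def __init__(self, lights, objects):
        # As in Maya, every light starts out linked to every object.
        self.links = {light: set(objects) for light in lights}

    def unlink(self, light, obj):
        self.links[light].discard(obj)

    def illuminates(self, light, obj):
        return obj in self.links[light]

linker = LightLinker(["PxrRectLight"], ["Floor", "Handle", "Nails"])
linker.unlink("PxrRectLight", "Floor")
print(linker.illuminates("PxrRectLight", "Floor"))   # False: floor excluded
print(linker.illuminates("PxrRectLight", "Handle"))  # True: still lit
```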

20

CREATE A POOL OF LIGHT

The rod light filter needs to be assigned to all three lights in the scene. The easy way to do this is to have all the lights selected when you initially create the rod light filter. Select the PxrRectLight, the PxrDomeLight, and the PxrMeshLight_Bulb objects in the Outliner. Then right-click on the Light Filter icon in the RenderMan shelf, and select the PxrRodLightFilter menu item. The rod light filter is now visible in the perspective view, as a small dome shape. Press the F key on your keyboard to adjust the view.

21

ADJUST THE FILTER SIZE

Let's change the IPR camera to the perspective view so we can more clearly see the effect of the rod light filter. Adjust the perspective view so we are looking down on the scene from above. Next, we're going to adjust the scale of the rod light filter to create the ‘pool of light’ effect. Select the PxrRodLightFilter object in the Outliner and then click on the Transform tab in the Attribute Editor. Change the scale to 65 on all three axes. This will define the overall size of our ‘pool of light’ in the scene.

22

INVERT THE ROD LIGHT FILTER

Let's switch to the Shape tab in the Attribute Editor to adjust the look of the PxrRodLightFilter. Expand the Multiplier section and enable the Invert checkbox. Now the scene illumination is limited to the circular area inside the rod light filter. The rod light filter is shaped like a ‘dome’ in the Maya Viewport. This dome shape is a useful tool for visualising the size of the rod light filter and gauging where the ‘pool of light’ will be visible in the scene. To refine the light falloff, expand the Rod Shape section and change the Radius to 0.5. Then set the Rod Shape Edge value to 0.5.
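The Radius and Edge controls describe a soft radial falloff from the filter's centre. The maths can be sketched with a smoothstep ramp (an illustration of the idea, not Pixar's actual filter implementation):

```python
def smoothstep(edge0, edge1, x):
    """Hermite interpolation clamped to [0, 1]."""
    t = min(max((x - edge0) / (edge1 - edge0), 0.0), 1.0)
    return t * t * (3.0 - 2.0 * t)

def rod_multiplier(distance, radius=0.5, edge=0.5, invert=True):
    """Light multiplier at a given distance from the rod filter centre.

    Inside the radius the multiplier is 1.0; across the edge width it
    smoothly falls to 0.0. With invert=True the illumination is kept
    inside the rod, matching the 'pool of light' setup in the steps.
    """
    inside = 1.0 - smoothstep(radius, radius + edge, distance)
    return inside if invert else 1.0 - inside

print(rod_multiplier(0.0))   # centre of the pool: full light
print(rod_multiplier(0.75))  # halfway across the edge: dimmed
print(rod_multiplier(2.0))   # outside: darkness
```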

23

SWITCH TO THE RENDERCAM

The scene is looking rather impressive at this point! We now have all the lights in place, the colour temperature adjusted to produce a night-time moonlit environment, and we have created a ‘pool of light’ effect using a RenderMan rod light filter. Let's now switch over to the Render_Cam in the Maya Viewport, as this will be our point of view for rendering the final work light animation. From the Panels menu in the Maya Viewport select Perspective>Render_Cam.


24

DRAMATIC NIGHT-TIME LIGHTING

Right-click over the IPR clapboard icon and select the Render_Cam. We have really nailed the dramatic lighting! At this point, let's save our Maya scene and we will continue this project in part four of the tutorial series. In the final part of this step-by-step series, we're going to finish the work light scene by adding depth of field and motion blur. Then we will review the RenderMan render settings and launch Local Queue to render out the final animation to a series of EXR files. We're also going to learn how the RenderMan denoiser can be used to slash render times! •




FOLLOW THE VIDEO http://bit.ly/3DW-robots

CINEMA 4D | REDSHIFT | X-PARTICLES

TURBOCHARGE YOUR MOTION GRAPHICS Find out how to use the Redshift Object tools to enhance viewport performance for motion graphics in Cinema 4D

Cinema 4D from Maxon has a well-deserved reputation as a leader in creating advanced motion graphics. With the acquisition and integration of the GPU rendering powerhouse Redshift into the Maxon family, motion graphics solutions have received a speed and efficiency boost. This is because Redshift has tools that can create fast proxy geometry, which can drastically speed up responsiveness in Cinema 4D, both in the viewport and at render time.

In this tutorial, we will look at how powerful the Redshift Object tag can be for creating particle and spline geometry on the fly with X-Particles and the Matrix MoGraph object. All of this can be done without developing actual geometry in Cinema 4D, which can really slow a scene down. We will also see how efficient the Redshift Proxy can be at speeding up the Cinema 4D viewport. A Redshift Proxy is a file, still or animated, which bakes out the selected geometry of a scene into separate files. These are both fast-loading and easy to use across a range of C4D scenes. At the end of this tutorial, you will have learned skills that are transferrable into all elements of a Cinema 4D pipeline. Redshift Proxies are an incredibly powerful way to leverage large datasets easily in Cinema 4D.

DOWNLOAD YOUR RESOURCES For all the assets you need go to http://bit.ly/3DW-robots


AUTHOR Mike Griggs Mike Griggs is a 3D and visual effects artist with vast experience across the industry, as both a creator and a technical writer. www.creativebloke.com



01

CREATE A CENTRAL OBJECT

Make a new Cinema 4D file for every element created, as this allows greater control when compiling the scene later. In the new file for the Central Element, create a sphere. Cinema 4D spheres have a range of geometry options. Choosing an octahedron at this stage gives a distinct pattern to work with. Keeping this as a live 'object' rather than a polygon throughout will mean that it is easy to change throughout the creative process, which will enable more straightforward iteration.

02

EXTRUDE THE SPHERE

As well as the standard MoGraph objects, Cinema 4D also has the MoExtrude, which can work on geometry within an object. Add a new MoExtrude object and place it under the sphere in the Object Manager. This means that any changes applied to the MoExtrude object are only applied to the sphere. In the attributes of the MoExtrude object, set the Extrusion Steps to 1.

03

ANIMATE THE EXTRUSIONS

Add a Random Effector, then set the Position parameter in Y to the desired amount. This creates a randomised pattern. In the Effector tab of the Random Effector attributes, set the Random Mode to Noise. Adjust both the MoExtrude length and the Position parameter to create a subtle bump effect on the surface of the sphere.

04

USE A FIELD FALLOFF

Add a Linear Falloff Field to the Random Effector so that only some of the polygons are affected. The linear field can then be animated without any keyframes by adding the Vibrate tag and setting the rotation to 360º for all three values for the Amplitude. Reduce the frequency to 2 to create a smoother animation which gradually extrudes random faces on the sphere’s surface.

05

ADD AN HDR TO LIGHT THE SCENE

Cinema 4D comes with HDRs to use, but for this scene I am using the Greyscalegorilla Pro Metals suite of HDRs, which work with their HDRI Link plugin. This provides a quick way of accessing a range of HDRs and can be dragged and dropped. To hide the background, uncheck Enable Background in the Environment setting in the Dome Light attributes.

06

BEVEL THE EDGES USING A REDSHIFT MATERIAL

Create a new Redshift material, and use the Copper preset. To soften the hard edges, add an RS Round Corners node to the Bump Input channel. The Round Corners radius can be adjusted as desired to create a small bevelled edge. Add a Bevel object to the scene and group it with the sphere object with a new null object. Set the Bevel level to the desired radius to create an object that looks more organic, and loses the CGI effect. Again, both of these elements can be adjusted throughout the creative process.

Fast drives
When working with Redshift Proxies, the faster the drive, the better the performance. As most computers now come with SSD drives, try and keep proxies on that, as this will give better performance. Once a Redshift Proxy has been played through once, it should be much faster on playback compared to the original Cinema 4D elements.
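The Random Effector plus Linear Field combination in steps 03 and 04 boils down to "random offset, weighted by position along the falloff". A minimal sketch (illustrative names and a plain linear ramp, not Cinema 4D's field system):

```python
import random

def extrusion_offsets(face_heights, seed=0, strength=1.0,
                      falloff_start=0.0, falloff_end=1.0):
    """Per-face extrusion lengths: random noise weighted by a linear falloff.

    Faces below falloff_start get no offset; faces above falloff_end get
    the full random offset; in between, the weight ramps up linearly.
    """
    rng = random.Random(seed)
    offsets = []
    for y in face_heights:  # treat the falloff as running along Y
        t = (y - falloff_start) / (falloff_end - falloff_start)
        weight = min(max(t, 0.0), 1.0)
        offsets.append(rng.random() * strength * weight)
    return offsets

# Faces below the falloff stay flat; faces above it extrude randomly.
print(extrusion_offsets([-0.5, 0.5, 1.5], seed=42))
```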




07

CREATE BASIC WEATHERING WITH AN RS NOISE NODE

Add an RS Noise node to the material and connect it to the Refl Roughness to create some basic weathering. To check what the Noise is doing without having to see ‘through’ the material, the RS Noise can be connected directly to the Surface node, which shows it in the Redshift RenderView. This works with most nodes within Redshift and is a great way to troubleshoot and refine nodes directly. When the Noise is at a scale and contrast that suits, connect it to the Refl Roughness channel.

08

USE A MATRIX OBJECT

The Matrix object is similar to the MoGraph Cloner object, apart from the fact it produces points rather than clones of objects. For new Cinema 4D artists this can be confusing as the Matrix object does not create anything to ‘see’. Add a Matrix object, set the Mode to Object, then drop the Sphere object in the Object dialog. Points can now be seen across the sphere in the viewport.

09

ADD GEOMETRY

Add a Redshift Object tag to the Matrix object by right-clicking on it in the Object Manager. Once added, a Particles tab will appear in the attributes for the Redshift Object tag. Set the mode to Sphere Instances, and in the Redshift RenderView a small sphere should appear for every point of the Matrix object. Adjust the Scale of the Matrix points in its Transform tab, and the Scale Multiplier of the Redshift Object tag to create an equivalent size in both the RenderView and the Cinema 4D viewport.

10

MOGRAPH

The Matrix object works just like a cloner and therefore, so do all the MoGraph effectors. Adding a Plain Effector can push the Matrix points away from the surface, and the same Linear Field Falloff object that animates the MoExtrude object can be used to control this effect. If the effect is to be inverted, it is best to just create a duplicate of the Linear Falloff Field.

11

ADD MATERIALS TO THE MATRIX OBJECT

Now that the Matrix object can be seen in the Redshift RenderView, it can have Redshift Materials applied to it just as if it were a regular object. The Redshift Sphere objects are much quicker to render than a ‘traditional’ combination of a cloner object and a sphere object, and as such viewport and Redshift RenderView performance is faster as well. Create a new Redshift Incandescent Material and apply it to the Matrix object, so that the sphere instances appear as small glowing spheres.

12

CREATE A REDSHIFT PROXY OF THE SCENE

Gather all the elements, bar the HDR dome, into a new group Null. Select the Null and from the Object Manager Edit menu choose Select Visible. Go to the main Cinema 4D menu, and select File>Export>Redshift Proxy(.rs) and a palette will pop up. Make sure the Export dropdown is set to Selected Objects, Add Proxy to Scene is ticked, and that Animation is set to All Frames.
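To experiment with a Matrix-style scatter outside of Cinema 4D, a Fibonacci spiral distributes points evenly over a sphere, much as the Matrix object covers the source geometry with points (a stand-in sampler for illustration, not Maxon's algorithm):

```python
import math

def sphere_points(count):
    """Scatter `count` points over a unit sphere with a Fibonacci spiral."""
    golden_angle = math.pi * (3.0 - math.sqrt(5.0))
    points = []
    for i in range(count):
        y = 1.0 - 2.0 * (i + 0.5) / count  # evenly spaced heights
        r = math.sqrt(1.0 - y * y)         # ring radius at that height
        theta = golden_angle * i           # spiral around the axis
        points.append((r * math.cos(theta), y, r * math.sin(theta)))
    return points

pts = sphere_points(100)
print(len(pts))  # 100 candidate instance positions
```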



13

SAVE IT

If the Proxy is to be used among multiple files, it might be a good idea to keep it in a separate folder within the Cinema 4D 'tex' folder. When the Redshift Proxy is saved, an individual file is created for each frame, so it is a good idea to save to as fast a drive as possible. When the Proxy creation process has finished, a new Redshift Proxy object will appear in the Object Manager.

14

WORK WITH THE REDSHIFT PROXY

If the Proxy files are being kept in the local 'tex' folder, reduce the path to just the file name. Set the Preview to Mesh and hide the original object group. A grey version of the shape without the matrix dots should appear in the Cinema 4D viewport. In the Redshift RenderView, the Proxy object should look identical to the original object. The most significant noticeable difference is in playback speed in the Cinema 4D viewport, which can actually be up to ten times faster.

15

USE THE PROXY IN THE SCENE FILE

The Redshift Proxy object can be added to the scene file that will be used for the main render, either by adding it from the Redshift menu or by copying and pasting the Redshift Proxy from the Sphere Cinema 4D file. A Redshift Proxy can also have different materials added to it just like any object, as long as the Materials option in the Attributes palette is set to Object.

16

MAKE TRAILS OBJECT USING X-PARTICLES

In a new Cinema 4D file, created at the same level as the other files, we will make the Trails object. Create a Null Group and within it, place a Platonic object and an Arch spline. Add an X-Particles system, set the Emitter Shape to Object and drag the Platonic object to the Object dialog box. Apply an Align to Spline tag to the Platonic and choose the Arch as the spline. Set two keyframes for the Position in the Align to Spline at 0 and 100% at 0 and 100 frames to create an animated movement with particles.
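Because the exporter writes one proxy file per frame (as noted in the Save It step above), an animated proxy becomes a numbered file sequence on disk. A small helper for generating such names (the '<base>.<frame>.rs' pattern is an assumption for illustration; check the names Cinema 4D actually writes):

```python
def proxy_frame_path(base, frame, padding=4, ext="rs"):
    """File name for one frame of an animated Redshift Proxy sequence."""
    return f"{base}.{frame:0{padding}d}.{ext}"

print(proxy_frame_path("tex/proxies/sphere", 12))  # tex/proxies/sphere.0012.rs
```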

17

USE THE XPCONSTRAINTS TO ‘CLUMP’ THE PARTICLES

Reduce the number of particles to around 1,000 and reduce the speed to a couple of centimetres. The xpConstraints modifier, which is added in the Dynamics part of the X-Particles system, can be used to make the particles move as if they were trapped in liquid and also attract themselves to each other. Play with the settings in the Connections, Forces and Viscosity tabs in the xpConstraints tag to get a feeling of movement for the particles.




18

UTILISE THE XPSPEED MODIFIER

Even with the xpConstraints modifier, particles can still drift away from the spline. To get this under control, use an xpSpeed modifier with a Spherical Field Falloff. Activate the Clamp Maximum speed in the xpSpeed attributes to a very low value. This will not stop the particles, but prevents them from drifting off while still giving them enough momentum for the attraction settings from the xpConstraint tag to work. Adjusting the size of the Spherical Field Falloff can be a good way to bring more life to the particle system.
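Clamping a maximum speed means limiting the velocity's magnitude while keeping its direction, so the constraint forces can still steer the particles. A sketch of the idea (not X-Particles' implementation):

```python
import math

def clamp_speed(velocity, max_speed):
    """Clamp a velocity vector's magnitude, preserving its direction."""
    speed = math.sqrt(sum(c * c for c in velocity))
    if speed <= max_speed or speed == 0.0:
        return tuple(velocity)
    scale = max_speed / speed
    return tuple(c * scale for c in velocity)

print(clamp_speed((3.0, 4.0, 0.0), 1.0))  # direction kept, magnitude 1.0
```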

19

ADD GEOMETRY TO X-PARTICLES SYSTEM

This would be an excellent point to create an xpCache object, and save the cache to the 'tex' folder in its own cache folder to keep everything local. To ensure the Redshift Proxy will work properly, and because the particle count is relatively small, create a duplicate of the Platonic object, create an xpGenerator object and make the duplicate Platonic a child of it.

20

LINKING THE ELEMENTS

To create the web of the trail, use an xpTrail object to create spline elements between the particle points. Experiment with the settings in the xpTrail object, as there are a lot of different ways to connect the particles using the Algorithm dropdown. Try and find a balance between creating obvious connections and something that still feels delicate like webbing. The particle movement from the xpConstraint will still apply to the movement of the threads.
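One of the simplest "webbing" strategies an xpTrail-style setup could use is to join every pair of particles closer than some threshold. A brute-force sketch for intuition (the real object offers several algorithms; this is not its implementation):

```python
import math
from itertools import combinations

def trail_segments(points, max_dist):
    """Return index pairs of points closer together than max_dist."""
    segments = []
    for (i, a), (j, b) in combinations(enumerate(points), 2):
        if math.dist(a, b) <= max_dist:
            segments.append((i, j))
    return segments

print(trail_segments([(0, 0, 0), (1, 0, 0), (5, 0, 0)], 2.0))  # [(0, 1)]
```

Raising max_dist produces a denser web, which mirrors the balance the step describes between obvious connections and delicate threads.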

21

VISUALISE THE XPTRAIL MODIFIER

Add a Redshift Object tag to the xpTrail object. There is now a Curves tab. Choose Strips from the Mode dropdown, which will create basic polygon strips in the Redshift RenderView. As Redshift is making this geometry rather than Cinema 4D, it provides quicker feedback for the Cinema 4D viewport and the Redshift RenderView, than using the traditional C4D Sweep object method.

22

ADD ILLUMINATED TEXTURES

Create two RS Incandescent materials of opposing colours and add one to the xpTrail object and the other to the xpGenerator object. Create a new RS material for the original Platonic object, and using the Redshift Shader Graph, copy the RS material nodes from each of the two RS Incandescent materials into the Shader Graph of the Platonic material. Use a Material Blender node to combine the two materials, using the Wireframe node as a matte between them.




23

CREATE THE RS PROXY OF THE TRAILS

Collect all of the relevant objects into a parent Null and go to File>Export>Redshift Proxy(.rs). Again create a folder within the master 'tex' folder for the Proxy files and allow Cinema 4D to create a file for each frame. When finished, a new Redshift Proxy object should be in the Object Manager. Clear the Path to just the file name and copy this Proxy object into the Scene file.

24

CLONE THE X-PARTICLES REDSHIFT PROXY

As far as Cinema 4D is concerned the Redshift Proxy is just another object, so using it as the driver for a Radial Cloner object is a great way to make the most of it. As everything in this Cinema 4D file is a proxy, playback is rapid and not reliant on X-Particles or MoGraph caching, which keeps the scene responsive and lets the artist concentrate on composition. Play a sequence through in the viewport, and it should be faster on the second play as the RS Proxy files will have been loaded into the viewport.

Updating Proxies
Redshift Proxies can be easily updated, primarily if each proxy element is localised to its own Cinema 4D file. Just use the same process to create a Proxy as outlined in the tutorial and save with the same file name in the same location, and the Proxy files will be rewritten. In the scene file Redshift will automatically recognise the new Proxy and update the RenderView with the new proxy element.

25

CLONE THE REDSHIFT PROXY CLONES

The Redshift Proxy system is so robust that an Instance can be created of the main elements, dropped into another cloner and used as a background object. If different materials or more advanced animation are required, just duplicate the original group and drop that into the cloner. These methods will be quicker with Redshift Proxies than with normal Cinema 4D objects.

26

LIGHT THE SCENE

Add a Redshift Environment object to create both volumetrics and bokeh. Add a Redshift Area Light in the background to create volumetric effects. In the Volume tab set the Contribution Scale to 1, which will blow out the image, and use the Area Light Intensity Multiplier in the General tab to control the amount of volume effect. This method tends to give a cleaner result at render time.

27

SET BOKEH WITH THE CAMERA TAG

To create a realistic bokeh, add a Redshift Camera. When it is in the Object list, first of all enable Bokeh in the Redshift Camera tag. In the Bokeh tab, click on the Override and Enabled checkboxes. Use the CoC Radius to control the amount of bokeh, and keep the Power at 1 or higher. Using this method will keep realistic highlights and should help to prevent grain.

28

ADJUST REDSHIFT RENDER SETTINGS

To render with Redshift, first of all make sure that Redshift is selected in the Renderer settings dropdown. In the General tab, go to the Unified Sampling settings and adjust the Samples Max to 512 and the Samples Min to 64. If that causes a slow render time, reduce those settings – as ever, it's a compromise between quality and the time needed to get the right result. Activating the sampling overrides for Light, Volume and Reflections to around 1,024 samples also helps with the quality of the render.
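Unified sampling is adaptive: clean pixels stop at Samples Min, noisy ones climb towards Samples Max. A toy model of that clamping behaviour (Redshift's actual error heuristic differs):

```python
def samples_for_pixel(noise_estimate, threshold=0.01,
                      samples_min=64, samples_max=512):
    """Pick a per-pixel sample count between the unified sampling bounds."""
    if noise_estimate <= threshold:
        return samples_min
    # Scale up with how far the noise exceeds the threshold, then clamp.
    scale = noise_estimate / threshold
    return min(samples_max, max(samples_min, int(samples_min * scale)))

print(samples_for_pixel(0.005))  # clean pixel: stays at the minimum
print(samples_for_pixel(0.05))   # noisy pixel: more samples, still capped
```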



Make a photorealistic nature scene

3DS MAX | CORONA RENDERER | BRIDGE (QUIXEL) PHOTOSHOP | NIK COLLECTION – COLOR EFEX PRO | FILEOPTIMIZER

MAKE A PHOTOREALISTIC NATURE SCENE Discover a simple process for creating a realistic-looking render with Megascans

AUTHOR

Mohammadreza Mohseni
Mohammadreza is a CG artist, tutor, certified architect and professional photographer with over 13 years’ experience in 3D rendering and visualisation. instagram.com/mohseni.mr

In this tutorial, we will cover the process for achieving a photorealistic look in close-up nature scenes by utilising the most straightforward methods possible. Therefore, our priority will be focusing on convenient yet effective techniques to produce the desired result. The path to a realistic render has never been easier with the use of invaluable Quixel assets (Megascans) and apps, namely Bridge, which is an efficient tool for quickly and accurately importing resources to 3ds Max to achieve an instant realistic result. However, the manual process of using Quixel 3D scan assets will also be covered. Additionally, we will explore rendering and lighting by seamlessly combining the IBL method with artificial illumination using Corona Renderer. Moreover, this tutorial will illustrate how to create chromatic contrast and colour harmony in conjunction with the composition of elements in a frame, and how this can be utilised to help send a message to the audience, establish the desired mood or even tell a story. We will be using Photoshop and the Color Efex Pro plugin for sharpening the details in our image, colour correction and colour grading. Finally, we will conclude by introducing the Caesium Image Compressor, a free image compression tool for sharing the final output render on the web.

DOWNLOAD YOUR RESOURCES For all the assets you need go to http://bit.ly/3DW-robots






TEENY-WEENY SPARROWS A photorealistic nature scene created with 3ds Max and Quixel Bridge




01

INITIAL MODELLING

For this scene I used 3D-scanned assets from Megascans for the models, including some rocks and stumps, and mixed them up together in 3ds Max to create a natural-looking environment with a very high level of detail, for producing a realistic render. I have used the high-poly versions of those 3D scans, but you can also get nice-looking results with a LOD 0 version of it for a close-up shot like this scene.

02

QUIXEL BRIDGE INTRODUCTION

Quixel Bridge is an app that provides easy access to Megascans, a huge library of 3D-scanned assets, along with efficient asset-management features. The best part of Quixel Bridge, however, is its one-click integration with major 3D software like 3ds Max, Maya, Unreal and so on. Using this feature, you will get an instant model with proper materials and shaders based on the chosen render engine. To get started with this program after installation, click on Edit>Settings and make sure you have an updated version. Then simply choose the main program that you want to integrate with Bridge, which in the case of this tutorial is 3ds Max.

Consider your resources and optimise You may need to optimise the model using the ProOptimizer modifier in 3ds Max if your PC system resources are not quite up to the task, or the LOD version does not seem to have enough detail for your taste.


03

DOWNLOAD MEGASCANS ASSETS

Megascans assets can be downloaded from the Quixel website directly and also from Quixel Bridge (in this tutorial Bridge has been used because of its benefits). Firstly, you need to log in to your Quixel or Epic (if you want to use it for Unreal Engine) account in Bridge; after that you may download and use the assets. Search for an asset from the search bar on top of the assets grid; the categories may help you to find the right assets. After selecting an asset, a sliding window will appear with Download Settings, where you can tweak the texture resolution and material preset, and choose which channels of the asset to download. After setting up these options, just click on the blue Download button.

04

QUIXEL BRIDGE EXPORT

After selecting and downloading the right asset for your purpose, export it to the application of your choice, which here is 3ds Max. Next, install the Megascans live link script in 3ds Max. To install the script (which is written in Python), find the Copy link in blue under the Script title in Export Settings, then click on it and the script will be copied to the clipboard. Then go to the Scripting menu and click New Script. Paste the script and use Evaluate All (Ctrl+E) in the Tools menu to run it. Then you can export assets with just one click to 3ds Max.

05

USE MEGASCANS WITHOUT BRIDGE (CORONA)

There are some occasions when assets from the Megascans library need to be used directly rather than through Bridge: perhaps you do not have access to Bridge for some reason, or the export does not work as expected, which has been a common issue since Epic and Quixel's co-operation began. Using Megascans in Corona is fairly simple, but you may need to arrange the material as demonstrated in the step screenshot. To sum it up, in the Diffuse slot it is better to mix AO


Make a photorealistic nature scene

with Albedo and then apply it. For reflection use pure white, and for glossiness use the related map with gamma 1.0 applied. Setting up the Fresnel IOR is a little trickier: we do not have a specific map in Megascans for this slot, so we need to use the specular map, convert it to Fresnel by using an LUT, and apply a Color Correction to set up the gamma/contrast. Additionally, for Bump use a normal map with gamma 1.0 through the Corona Normal Map node (with an additional bump map). Optionally, you can also add a displacement map with gamma 1.0 directly to the material.

Faster environment lighting alternative It is possible to use Corona Sky rather than an HDRI to get faster rendering results; however, it may provide less lighting and shadow detail.
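The specular-to-Fresnel conversion described above can also be reasoned about numerically. The exact LUT is not specified in the tutorial, but the standard PBR relationship between a specular (F0) value and an index of refraction is well known; the helper name below is a hypothetical illustration in plain Python, not part of the Corona or Megascans toolset:

```python
import math

def specular_to_ior(f0):
    """Convert a PBR specular (F0) value to a Fresnel IOR.

    Derived from the Fresnel reflectance at normal incidence,
    F0 = ((n - 1) / (n + 1)) ** 2, solved for n.
    """
    s = math.sqrt(f0)
    return (1.0 + s) / (1.0 - s)

# The common dielectric default F0 = 0.04 maps back to IOR 1.5.
print(round(specular_to_ior(0.04), 2))  # 1.5
```

This is why most dielectric materials default to an IOR around 1.5: it corresponds exactly to the ubiquitous 4% specular value.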

07 COLOUR PALETTE

After choosing and reviewing the assets, their dominant colour can be extracted easily. In this scene the dominant asset colour has an orange hue, so I went for a complementary colour harmony and selected two main colours from it. I then created some supporting colours in case of need, which you can see in the image.
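The complementary-harmony choice can be sanity-checked in code. This is a generic sketch using Python's standard colorsys module, not a tool from the tutorial's pipeline; a complement is simply the hue rotated half-way around the colour wheel:

```python
import colorsys

def complement(r, g, b):
    """Return the complementary colour: same saturation and value,
    hue rotated 180 degrees around the colour wheel."""
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    return colorsys.hsv_to_rgb((h + 0.5) % 1.0, s, v)

# A saturated orange, like the dominant asset colour in this scene,
# complements to a blue, matching the blue-tinted lighting used later.
print(complement(1.0, 0.5, 0.0))  # roughly (0.0, 0.5, 1.0)
```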

DISPLACEMENT SETTINGS

In order to get nice surface detail from Megascans assets, displacement is a great tool. Corona 5 has a new 2.5D displacement mode, which is really helpful for achieving clean and detailed displacement without being too heavy on RAM. Turn this on; we will also use autobump as additional help. Autobump is hidden in Corona's development settings: go to the System tab and choose System Settings>Enable devel/debug mode to access it. Then just set the Displacement screen size in the Performance tab to get the quality you need. Here I have used 1.5px to achieve my desired quality.



08 GENERATE CHROMATIC CONTRAST

As a 3D artist, the easiest way to create contrast in a scene is to use colours that are opposite each other on the colour wheel. This can create eye-catching and clear images, as it is a chromatic technique that doesn't rely on harsh lighting. So, the next step is adjusting the shaders


and lights with that in mind. The lighting can be split into two parts, which will be covered in the next steps.

09

ENVIRONMENT LIGHTING

Environment lighting is, in my humble opinion, one of the most important contributors to realism. To reach the most detailed result, this scene has been lit by an overcast HDRI (from Peter Guthrie), and the Corona Color Correct node was used to edit the exposure, gamma, saturation and colour temperature, giving a more contrasted, less saturated but blue-tinted result to match my desired colour palette.

Asset colour for lighting It's important to carefully set the position of the light to sufficiently emphasise the light contrast in conjunction with the asset diffuse colours, which in this scene are the trunk and sparrows.

11

CREATE MOODY LIGHTING

Moody lighting enhances the realism in a nature shot, because this kind of environment is usually filled with scattered dust. It is easy to reproduce that subtle effect: just add a volumetric material in the Global Volume Material slot in Render Setup (F10)>Scene>Scene Environment, then set up the colour and scattering effect. Tweak the distance to set the density of the moody effect, and set a negative value for the directionality to get backward scattering. That's it, so simple.

12

ADD LIGHTS TO CREATE CHROMATIC CONTRAST

To produce chromatic contrast in this scene with the desired colour palette, another light is needed to create a warm orange colour in delightful contrast with the blue-tinted environment illumination. A disk-shaped Corona light is perfect in this specific situation. A Softbox texture has also been mixed with a Corona color map, with a highly saturated orange colour, and then added to the light.
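Incidentally, the density/distance control used for the volumetric material in the moody-lighting step maps onto standard volumetric scattering maths. As a rough intuition only (Corona's volumetrics are far more sophisticated), the fraction of light surviving a path through uniform fog follows the Beer-Lambert law; the helper below is a hypothetical plain-Python sketch, not a Corona API:

```python
import math

def transmittance(density, distance):
    """Fraction of light surviving a path through a uniform volume,
    T = exp(-density * distance) (Beer-Lambert law)."""
    return math.exp(-density * distance)

# A low density keeps the fog subtle: about 61% of light survives a
# 10-unit path, so distant elements soften rather than disappear.
print(round(transmittance(0.05, 10), 2))  # 0.61
```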


CAMERA COMPOSITION

The composition of the key elements in a scene needs careful consideration. The human eye arguably prefers order in an image, so the scene should be composed in the camera frame with this in mind. This can be managed by using guidelines like the golden ratio, golden triangle and golden spiral, which are all very effective and used by many great artists. So, I have taken advantage


of these guidelines to set my composition, and the result can be viewed in the step image.
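If you want to place such guides yourself, the golden-ratio gridline positions are easy to derive. This is a generic sketch in plain Python (the function name is a hypothetical helper, not part of any camera overlay tool):

```python
PHI = (1 + 5 ** 0.5) / 2  # the golden ratio, ~1.618

def golden_lines(size):
    """Return the two golden-ratio divisions of a frame dimension,
    analogous to rule-of-thirds lines but at 1/phi and 1 - 1/phi."""
    a = size / PHI
    return (size - a, a)

# For a 1920px-wide frame the vertical guides sit near 733px and 1187px,
# slightly tighter to the centre than rule-of-thirds lines (640/1280).
print([round(x) for x in golden_lines(1920)])  # [733, 1187]
```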

13

CAMERA SETTINGS

For this part, you may need some additional research to get a realistic result. After analysing the camera settings used in real-world bird photography, some typical guidelines emerge that help to produce a more natural and realistic render. Firstly, use a telephoto or super-zoom lens (Focal l. in the Corona Camera), 85mm-600mm. Secondly, activate DOF, then lower the f-stop to get more bokeh and to separate the background from the subject; usually a 2.8-6.4 lens will achieve the desired result. Thirdly, for nice realistic bokeh use a custom-shaped aperture map, which you can find in Corona Camera>Bokeh>Aperture Shape>Custom. Then use the ISO parameter to get a balanced overall exposure.

Speed up rendering of volumetric shader Use the Single Bounce Only option for scattering in order to speed up the rendering. This will produce a darker and biased result, but in some subtle cases like this scene, it may come in handy to get the desired effect faster.
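The lens guidance above (long focal length, low f-stop) can be made concrete with a standard thin-lens estimate of background blur. This is generic photography maths, not a Corona formula, and it assumes the background is effectively at infinity; the function is a hypothetical helper:

```python
def background_blur_mm(focal_mm, f_number, subject_dist_mm):
    """Approximate on-sensor blur-disc diameter for a background at
    infinity: b = (f / N) * m, with magnification m = f / (s - f)."""
    m = focal_mm / (subject_dist_mm - focal_mm)
    return (focal_mm / f_number) * m

# A 300mm lens at f/4 focused on a bird 5m away blurs the distant
# background far more than an 85mm lens at the same settings.
print(round(background_blur_mm(300, 4.0, 5000), 2))  # 4.79
print(round(background_blur_mm(85, 4.0, 5000), 2))   # 0.37
```

The quadratic dependence on focal length is why wildlife photographs (and renders imitating them) get such creamy backgrounds from telephoto lenses.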

14

CORONA SETTINGS

Corona's settings are simple, and you don't really need to worry about them. But in this scene there are some parameters that help to get a noise-free render much faster. Because the scene has a really shallow DOF, set the GI vs AA balance to a low value like 4 to procure a cleaner result in a shorter amount of time. Also, use the most recent denoiser engine available in Corona, the Intel AI denoiser: denoising is faster with decent quality, and because of the AI technology we can get away with a higher noise level, resulting in an overall quicker render.

15

CORONA IMAGE EDITOR (CIE)

The render has finished and the denoising process has been


applied, so post-processing can begin. I recommend saving the Corona output in CXR format with 32-bit colour depth, which is the perfect file type to work with in CIE. CIE can tone map a render brilliantly, setting shadows and highlights to the desired level. After that, Corona's colour lookup tables can come in handy to give the render an awesome look. You may also need to adjust the colour saturation and vignette, then save to 16-bit TIFF or PNG to finalise this step.

16

COLOR EFEX PRO (NIK COLLECTION PLUGIN)

One of my preferred tools in the post-production stage is Color Efex Pro, especially for the effect called Pro Contrast. This awesome effect can produce genuinely pleasant visual contrast and extend dynamic range of the image to deliver the render with a unique look. Moreover, there is an option


to apply the correction to the render's colour cast; however, it is intentionally disregarded here to maintain a warm look instead of the corrected natural look.

17

COLOUR CORRECTION IN ADOBE CAMERA RAW

First of all, adjust the white balance to get the colours right; ACR (Adobe Camera Raw) has been used to set the temperature and tint correctly. (You could use Auto, then lower the values if the result is too much for your taste.) Next, we make the colours more vibrant with a subtle change to the Vibrance slider (+12), and by altering the Texture and Clarity sliders (+10, +15) we can further extract the details. The Dehaze slider with a negative value helps to get a hazier, moodier effect. Lastly, use the HSL adjustments to change the hue, saturation and luminance of each colour channel separately, then utilise split toning in conjunction with the Calibration section to achieve the final colour.

Image compression in Photoshop There is a feature in Photoshop called Save For Web (File>Export) which can compress the resulting image nicely. A Quality of 60 or higher would be good enough.

18

ADD CHROMATIC ABERRATION

Chromatic aberration is almost always captured by real-world DSLR cameras because of the design of their lens elements, so many artists reproduce this effect in their 3D work using various methods. However, a common problem in the community is exaggerating its intensity: be careful to keep the effect subtle for a more pleasing look. To apply chromatic aberration head to Filter>Lens Correction>Custom, and use the sliders in the Chromatic Aberration section to dial in a minimal amount.
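Mechanically, chromatic aberration is just the colour channels being shifted or magnified slightly differently. A minimal sketch of the idea in plain Python on a tiny list-of-rows "image" (Photoshop's Lens Correction does something more sophisticated, scaling channels radially from the centre):

```python
def shift_channel(channel, offset):
    """Shift a channel (list of rows) horizontally by `offset` pixels,
    clamping at the image edges."""
    w = len(channel[0])
    return [[row[min(max(x - offset, 0), w - 1)] for x in range(w)]
            for row in channel]

def chromatic_aberration(r, g, b, amount=1):
    # Push red and blue in opposite directions; keep green in place.
    return shift_channel(r, amount), g, shift_channel(b, -amount)

row = [0, 0, 255, 0, 0]
r2, g2, b2 = chromatic_aberration([row], [row], [row], amount=1)
# A white dot gains a red fringe on one side and a blue one on the other.
print(r2[0], b2[0])  # [0, 0, 0, 255, 0] [0, 255, 0, 0, 0]
```

Even this toy version shows why the effect must stay subtle: at amount=1 the fringes are already a full pixel wide.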

19 POST-PRODUCTION IN PHOTOSHOP

One of the main tasks in post-production is sharpening the image. In this step, High Pass sharpening has been used in order to deliver flawless detail sharpening. Just duplicate the layer created via the Nik Collection, then change the blending mode of the layer to Soft


Light and apply the filter via Filter>Other>High Pass with a radius of 1 pixel. Then mask out any areas you do not want to be sharpened.
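The core of High Pass sharpening is simple: subtract a blurred copy of the image to isolate fine detail, then add a portion of that detail back. A one-dimensional sketch in plain Python (the Soft Light blend Photoshop uses is a more nuanced combine than this straight addition; the function names are hypothetical):

```python
def box_blur(values, radius=1):
    """Simple box blur with edge clamping."""
    n = len(values)
    out = []
    for i in range(n):
        lo, hi = max(0, i - radius), min(n, i + radius + 1)
        out.append(sum(values[lo:hi]) / (hi - lo))
    return out

def high_pass_sharpen(values, amount=1.0, radius=1):
    """sharpened = original + amount * (original - blurred)."""
    blurred = box_blur(values, radius)
    return [v + amount * (v - b) for v, b in zip(values, blurred)]

edge = [10, 10, 10, 200, 200, 200]
# Values on either side of the edge overshoot, increasing local contrast
# while flat regions are left untouched.
print(high_pass_sharpen(edge))
```

Flat areas pass through unchanged, which is exactly why a small radius (1 pixel, as in the step above) sharpens detail without haloing large features.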

20

IMAGE COMPRESSION FOR WEB

Save the finished image as PNG or TIFF to get a decent quality output, but a more compressed image is essential for sharing the render on the web. A dedicated image compressor such as the free Caesium Image Compressor helps here: it achieves high-quality compression, can resize to reduce the file size even further, and can process many images at once. A JPG with a Quality slider value between 60-80 is compressed enough for sharing on most websites without losing any visible quality. Caesium updates the preview in real time, so if the result is not good enough for your taste, just increase the Quality value.



Create a Disney Pixar-style project

BLENDER 2.81

CREATE A DISNEY PIXAR-STYLE PROJECT

Pietro Chiovaro takes you through the full workflow for the creation of a Pixar-inspired scene

AUTHOR

Pietro Chiovaro Pietro is an Italian 3D artist who creates 3D assets and environments, and is currently working on an open-source game. pietrochiovaro.artstation.com

Pixar Animation Studios is one of the most iconic and influential CGI studios in the world. In 1984 they released the first CGI short movie with characters and a story, The Adventures of André & Wally B. In this short they introduced innovative new technologies and features related to CGI production, such as particle systems and motion blur. Nowadays their style and stories are well known and appreciated all around the world, and many different companies and animation studios try to re-create the cartoony (but at the same time realistic) style introduced by Pixar.

Being inspired by this amazing company, this issue I will show you how to create a Disney Pixar-style project starting from scratch, in particular Andy’s house from Toy Story 4. In the tutorial I will also show you all the main steps for creating the outdoor environment, starting from simple geometries, and how to add all the extra little details and lights that help to give a greater emphasis to the final scene.

DOWNLOAD YOUR RESOURCES For all the assets you need go to http://bit.ly/3DW-robots

PIXAR-INSPIRED A re-creation of Andy’s house from Toy Story 4



Create a Disney Pixar-style project

FOLLOW THE VIDEO http://bit.ly/3DW-robots




01

PRIMITIVE ENVIRONMENT

Once you have imported your main reference into the software, studied the project and formed a clear idea of the environment and location, it's time to create a very simple blockout of the scene. In this step, simply place some primitive geometry following the concept, to build an initial understanding of the space and the distribution of assets in the working area.

02

MODEL THE HOUSE

Now we have a primitive version of the project and a clear idea of the context and space. The next step is to start refining the primitive geometries in order to add more detail; in this case I start with the house, subdividing and scaling a simple cube in order to fit the main reference.

03

ADD THE GARAGE

Having modelled the house in a really simple way, keeping the polygon count low, I then decided to focus on the garage: I duplicated the base mesh of the house by pressing Shift+D, and scaled it down to create this second element of the building.

04

CREATE THE HOUSE ASSETS

In order to add further detail to the simple base geometries, we can model all the exterior elements of the house such as the windows, the frames, the doors and the lamps. As with the previous props, a key element is the reference that we have chosen.

05

MODEL THE STREET

This is the first part of the outdoor environment of the project: in this step we have to model the road, using a simple plane, and the footpaths that link all the houses together. We can even add some subdivisions in order to create some imperfections in the asphalt.

Ambient occlusion AO is a shading and rendering technique used to calculate how exposed each point in a scene is to ambient lighting. With the right parameters you can increase the depth of an object and the shadow interaction.

06

MAKE THE GARDEN

Once we have finished the street, we can begin to create the garden, which will help us to link the house to the footpath. Here we



can create a simple hair particle system to make the grass, and add some stones by creating a few different low-poly spheres.

07

MASKS

Another really important step, used frequently in the production of movies and videogames, is the use of masks and planes placed on top of grounds and walls to create some imperfections and add the extra little details that help us to achieve greater realism.

08

EXTERIOR OBJECTS

Next we have to model other meshes like the mailbox, trash bin, outdoor lamps, telephone pole and even cars, in order to give more life to the environment.

09 CREATE THE MATERIALS

Some artists hate this step, others love it. I’m talking about the production of materials for every single model. To do this there are many different methods, and I’ll show you some of the approaches in the next steps, but right now we have to create and assign some default materials to every single mesh of the project.


10

UV MAPPING

Once all the models have been created and a base material has been assigned to them, a key step before we devote ourselves to material production is the unwrapping process. Here I selected and unwrapped all the meshes one by one. A quick alternative, if you are not comfortable with manual unwrapping, is to use Smart UV Project.

11

METHOD 1 – IMPORT YOUR TEXTURES

The easiest way to create materials is to import some textures from your own library. It takes a few seconds, and you just have to fit the UV map to the textures of the material. The best textures support a PBR workflow and are usually used to create a realistic style; in this scene you can use your PBR textures for all the materials of the project. If you are looking for a stylised effect, you should follow the next step.



12

METHOD 2 – PAINT THE TEXTURES

If you are looking for a different style, or you want to add some details to a model's textures, the best workflow is to paint the texture of every single model. One of the most widely used applications for this kind of workflow is Substance Painter: essentially you import the mesh, bake it and create the material in relation to the mesh. Another simple option is vertex paint, but with Substance Painter you can create smart materials to help speed up your daily workflow.

13

METHOD 3 – CREATE THE SUBSTANCE

The third method for creating materials requires software like Substance Designer, the leading software for the production of PBR materials. Substance Designer gives you full control for producing a huge variety of material textures using a node-based workflow. In previous issues, I have delivered a selection of Q&As detailing how to create different materials using this software, enabling you to create your personal materials library.

Screen Space Reflections Thanks to this effect, all the materials will use the depth buffer and the previous frame's colour to create more accurate reflections than reflection probes. This effect will define the reflections in the project at the cost of a higher render time.

14

TRY DIFFERENT MATERIALS

We have done a big part of the project, but that's not all: another important step is to change the materials and try new combinations. For example, a concrete wall can be replaced with brick tiles, or a wood material with a metallic substance; you just have to experiment with different combinations and choose which best suits your project.

15

REPOSITION ELEMENTS

Along the lines of the previous step, you should also try changing the position of the elements in the scene. Altering the position of a door, a window and other meshes of the project can help you improve the impact of the scene and express your desired narrative.

16

LIGHTING

This is an essential step in any workflow, but is especially important in a Disney Pixar-style project. Lights add atmosphere and a different emphasis to the scene, expressing emotions and transforming something real to surreal and vice versa. In this step I started to place some point lamps inside the house and some exterior lamps for the street illumination.

17

WORLD HDRI

Another important element related to the lighting of a scene is the world lighting; in this case I used an HDRI map. To import the HDR file in Blender you have to load the image in the world settings and link it to the background of the world environment, then you can link a colour correction node to change the colour of the map.

18

VOLUMETRIC WORLD

In this scene I created a volumetric world in order to produce the foggy environment. The main reason for creating the fog is to underline the rain and the sky atmosphere, giving much more prominence to the house. I enabled the volume scatter and set the strength to 0.05, since I was looking for a soft effect.

19

INTRODUCTION TO THE RENDER ENGINES


That’s something that we have to decide before we render the project


and start the post-processing: in Blender there are two main render engines, Eevee and Cycles. In the next two steps I'll show you the main properties of these two engines, in relation to the final purpose of this project.

Light bloom Bloom (or glow) is a computer graphics effect used in videogames and CGI movies to reproduce an imaging artefact of real cameras. The effect produces fringes or feathers of light extending from the borders of bright areas in an image, contributing to the illusion of an extremely bright light overwhelming the camera or eye capturing the scene.

20

EEVEE

The Eevee engine, introduced with the release of Blender 2.8, is one of the most important parts of that update. Eevee is a real-time render engine, making it possible to render a frame in milliseconds (depending on your workstation). It was developed in order to reproduce many features of Cycles in real time.

21

CYCLES

Cycles is Blender’s physically based path tracer for production rendering. It is designed to provide physically based results out-of-thebox, with artistic control and flexible shading nodes for production needs. Its capability in the reproduction of lights, shadows and reflections make Cycles the best choice for any kind of production, but at the cost of an increase in render time.


22

CHOOSE THE RENDER ENGINE

After taking into account the properties of each render engine, and considering the purpose of this project (creating a one-minute animation of this Pixar-style render), I decided to use Eevee in order to decrease the render time. As I said before, from some technical standpoints Cycles is better than Eevee, but with the right parameters we can decrease this gap.

23

FINAL RENDER

In order to achieve a high-quality render with realistic lights, shadows and post-processing effects, you have to enable three settings from the render panel: AO (Ambient Occlusion), Bloom and Screen Space Reflections. If you need it, you can increase the shadow cubemap and soften the shadows, then we can run the render process in order to see the final result. •



AUTODESK MAYA

SIMPLIFY YOUR BLEND SHAPE CREATION

Discover how to create blend shapes with greater speed and efficiency with this quick step-by-step guide

Blend shapes, or morph targets as they are also known, are a powerful way to bring life to a static model. You create a target shape and your model can then blend into and out of it using a slider, giving a nice organic motion, even if it is a linear movement. Traditionally, creating blend shapes could be a long and tedious job. You first duplicate the main model, let's say it's a head, edit each vertex to create your expression and then repeat the process for all the other face shapes you need. As you can imagine, on those

more complicated projects this could involve generating 50 or more shapes so the character or creature you’re working on can express emotions. The biggest issue with this approach is if the main head model’s topology changes. What this used to mean was all the target shapes had to be updated too, or in the worst case, redone. There have been lots of improvements in blend shape creation over the years and some shortcuts have been developed to help speed up generation and prevent the need for any work to be repeated.


I use a mixture of methods, involving the component editor and painting blend shape weight values, and I like to create a blend shape generator of sorts to automate shape creation while also future-proofing against potential changes. However, the biggest improvement was the introduction of the sculpting tools, and also the Shape Editor.

DOWNLOAD YOUR RESOURCES For all the assets you need go to http://bit.ly/3DW-robots


AUTHOR Antony Ward Since the early 90s Antony has worked for many of today’s top game and VFX studios, and has written three technical manuals and many online tutorials. www.antcgi.com


Simplify your blend shape creation

01

YOUR FIRST TARGET SHAPE

The Shape Editor can be found on the sculpting shelf. When you first open it, Maya will automatically scan the scene for any existing blend shape nodes and fill the window for you. If none exist all you need to do is select the head model and then click Create Blend Shape. This creates the main container node called blendShape1, but you then need to define your first target shape which the head will blend into. To do this, just select the blend shape node and then click Add Target.

02

THE EDIT BUTTON

What I like about the Shape Editor is there are no separate models to play with, everything is held within the blend shape node. This not only makes for a cleaner scene, but also a smaller file. What you must remember is to hit the edit button next to the target you’re working on, so Maya knows which shape to manipulate. Maya will helpfully give you a warning if you try to work on a target that isn’t editable, but if you aren’t careful you will end up changing a shape you’re not working on, and if it’s not currently visible you may not see the edits until it’s too late.


Go deeper As well as having the ability to create and manage all your blend shape targets in one place, the Shape Editor has a whole host of other useful features. You can create in-between targets and combination shapes too, which can help to make your character's movements even more fluid and organic. If you decide later that you prefer to work on separate models, you can also use the Rebuild Mesh tool, which will output a model for you to edit directly without worrying about changing other shapes.


03

SCULPTING TOOLS

Defining your target shape is made even easier with the help of the sculpting tools. The Grab tool is essential for pulling the face around, meaning you can quickly achieve organic-looking expressions with just a few strokes.

Other key tools are the Smooth tool, which you can also access by holding Shift, and the Relax tool. These spread the distribution of the vertices across the model's surface while retaining its shape. This is good for fixing areas where there is texture stretching, and is also handy for tidying the topology.

04 SHAPE EDITING

In addition to the main sculpting options there are also a series of tools geared more towards blend shape creation, and these are found at the end of the sculpting shelf. These are the Smooth Target tool, Clone Target tool, Mask Target tool and the Erase Target


tool. What’s useful about these is they focus more on the targets you’re working on, so it reduces the risk of inadvertently editing the other shapes.

05

TRADITIONAL TOOLS ARE STILL ESSENTIAL

As good and as intuitive as the sculpting tools are, there are times when you need more precision; luckily, targets don't have to be edited only by sculpting. You can still work on them at a component level, meaning you can fine-tune the shape vertex by vertex should you need to. The Soft Selection tool also helps, giving a gradual falloff from your selection so the surrounding geometry is


influenced too, which can give a more organic feel.
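Soft Selection's gradual influence can be pictured as a falloff curve over distance. Maya offers several falloff curves; this hedged plain-Python sketch uses a smooth cosine ease, purely to illustrate the idea (the function name is hypothetical, not a Maya API):

```python
import math

def soft_select_weight(distance, falloff_radius):
    """Weight 1.0 at the selection, easing smoothly down to 0.0 at
    the falloff radius, using a cosine ease-out."""
    if distance >= falloff_radius:
        return 0.0
    t = distance / falloff_radius
    return 0.5 * (1.0 + math.cos(math.pi * t))

# Vertices closer to the selection move more than distant ones.
print([round(soft_select_weight(d, 4.0), 3) for d in (0, 1, 2, 3, 4)])
# [1.0, 0.854, 0.5, 0.146, 0.0]
```

Each surrounding vertex then moves by its weight times the edit you make, which is what produces the organic, bulge-free deformation described above.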

06

SYMMETRICAL MODELLING

There are times when you need to influence both sides of the model at the same time, especially if you’re working on the mouth area. This is where symmetrical modelling comes in handy. If you’re working in component mode, all you need to do is hold down Control, Shift and right-click to bring up the marking menu. Go up to Symmetry, and then again go to Symmetry to enable it. The components affected will then be highlighted in blue. If you’re sculpting, you can use exactly the same key combination to get to the symmetry options.

07

INDEPENDENT SIDES

Another time-consuming task when creating blend shape targets is having to divide them.

Separate models Even though the Shape Editor allows you to work directly on the main model without the need for separate head models, you can still add target shapes, generated separately, to the blend shape node. Simply select the new targets, and then the main head model. In the Shape Editor go to Create>Add Selection As Target and those will be added to the blend shape node in the editor. They will likely have Shape added to the name, so just rename them, test them and you’re good to go.

For example, if you have a mouth shape where the character is smiling, you want that to be editable on both sides of the face, so the animator can raise the mouth on either side, independently. One method of achieving this is to use more heads and blend shapes, editing the weight values directly so the input from the main head is only affecting one side at a time. This will not only divide that shape, but any others you then feed into the main head shape, saving you time in the long run.
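Splitting a full-face target into independent sides comes down to scaling each vertex's delta by a per-vertex weight, where the left and right weight maps sum to one, so the two halves add back up to the original shape. A hedged plain-Python sketch of the principle (a stand-in for editing blend shape weight values, not Maya's actual data model):

```python
def split_target(base, target, left_weights):
    """Split one target into left/right targets whose combined effect
    reproduces the original. Each vertex is an (x, y, z) tuple."""
    left, right = [], []
    for bv, tv, w in zip(base, target, left_weights):
        delta = [t - b for b, t in zip(bv, tv)]
        left.append(tuple(b + w * d for b, d in zip(bv, delta)))
        right.append(tuple(b + (1 - w) * d for b, d in zip(bv, delta)))
    return left, right

base = [(0, 0, 0), (1, 0, 0)]
grin = [(0, 1, 0), (1, 1, 0)]  # both mouth corners raised
# First vertex weighted fully left, second fully right: each new
# target now only raises its own corner.
left, right = split_target(base, grin, [1.0, 0.0])
print(left, right)
```

In practice the weight map would blend smoothly across the centre line of the face rather than jump from 1 to 0.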

08 A SIMPLER APPROACH

With the Shape Editor this can also be simplified. You have two options. Create the full grin target shape, using symmetry to work on both sides at the same time, and then right-click it in the Shape Editor to bring up the menu. You can then duplicate the grin shape and use the Erase Target


tool to remove the sides you no longer need. Alternatively, you can create just one side of the grin and then duplicate it, but this time simply use the Flip tool to move those edits across, giving you both sides but on different targets.

09

BEWARE OF TARGET COMBINATIONS

One thing to always be aware of when creating your target shapes is how they will work when combined. Blend shapes are additive, meaning the vertex movement from each target is simply added to the others. If you're working on the mouth area, for example, you need each target shape to work in harmony with all the others affecting the same area. So if you have one target that raises the corners of the mouth and another that pulls them out to the side, the two combined may, if you're not careful, result in a badly deformed model. •
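The additive behaviour described above can be sketched directly: each active target contributes its per-vertex delta from the base, scaled by its slider weight, and the deltas simply sum. This is a plain-Python illustration of the principle, not Maya's implementation:

```python
def blend(base, targets, weights):
    """Additively combine blend shape targets.

    base:    list of (x, y, z) vertices
    targets: list of vertex lists, same topology as base
    weights: one slider value per target
    """
    result = [list(v) for v in base]
    for target, w in zip(targets, weights):
        for i, (bv, tv) in enumerate(zip(base, target)):
            for axis in range(3):
                result[i][axis] += w * (tv[axis] - bv[axis])
    return [tuple(v) for v in result]

base = [(0, 0, 0)]
raise_corner = [(0, 1, 0)]  # moves a mouth-corner vertex up
pull_side = [(1, 0, 0)]     # pulls the same vertex outwards
# At full strength the deltas stack, so the corner ends up moved
# diagonally, further than either target produces alone.
print(blend(base, [raise_corner, pull_side], [1.0, 1.0]))
```

That stacking is exactly the deformation risk the step warns about: two individually reasonable targets can push the same vertices well past either target's intended extreme.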



3D BASICS

SIDEFX HOUDINI BASICS We continue our look at the core digital content creation applications; this issue, we explore SideFX Houdini

If you are new to CGI, there are far too many tools to choose from in a dizzying array of software. This series aims to break everything in CGI down to the very basics so that every artist can be armed with the knowledge of which tool is best. This month we look at Houdini.

When experienced CGI artists are asked what 3D application can do everything, many will answer with Houdini. Still, they will likely argue that it is also the most intimidating 3D application, due to its distinctive and sophisticated 'node'-based approach to asset and scene creation. SideFX, the developers of Houdini, have realised that this complexity is also the reason that it is one of the most loved applications in CG creation. They have strived, and have succeeded, in making Houdini one of the most accessible pieces of software in the CG industry.

Houdini is embedded deeply within the VFX and game community as the FX problem solver of choice for fluid dynamics, destruction, complex motion graphics and character work. Houdini utilises a node-based workflow, which can be a culture shock for new artists. Still, when it is mastered, or at least understood, it means that every component in a scene is easily manipulatable. The other attraction of Houdini is the way that it can handle this data: Houdini is far faster at working with dense motion graphics scenes, for example, than other software. The great thing about Houdini's node-based workflow is that

if there are changes, they are easier to manage and implement than would be possible in other software, making Houdini a great sandbox for experimentation. The Academy Award-winning SideFX are constantly working with other leaders in the CG community to continually add new cutting-edge resources. With Houdini 18 this includes the new SideFX Solaris layout workflow, which is based on the open-source Universal Scene Description (USD) format developed by Pixar that has the potential to revolutionise artist workflows, from freelancers to major VFX studios. With a compelling price structure, every CG artist deserves to have a copy of Houdini on their machine. Let’s take a quick look at modelling and rendering in Houdini, as well as using Solaris.

3dworld.creativebloq.com

AUTHOR Mike Griggs Mike Griggs is a 3D and visual effects artist with vast experience across the industry, as both a creator and a technical writer. www.creativebloke.com


SideFX Houdini basics

01

MODEL IN HOUDINI

Artist Rohan Dalvi has created this box of cupcakes to show off some of the key elements of Houdini 18. The cake geometry is created using a variety of poly modelling tools that each result in nodes wired together into a procedural network. This network defines the flow of data and updates can be easily made on any of the nodes to tweak the final look. VDB volume tools are then used to bring together the parts into a single coherent cake surface. It is then possible to use the network to branch off and build the paper wrapping utilising the shape of the cupcake as a guide.
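The defining property of such a procedural network is that every result is a function of its upstream nodes, so tweaking any parameter flows back through the chain. A toy sketch of that evaluation model in plain Python (not Houdini's actual node API; the node names and parameters here are invented for illustration):

```python
def evaluate(graph, node):
    """Recursively evaluate a node by first evaluating its inputs,
    so a change to any upstream parameter flows downstream."""
    spec = graph[node]
    inputs = [evaluate(graph, name) for name in spec.get("inputs", [])]
    return spec["op"](*inputs, **spec.get("params", {}))

# A toy two-node chain: a box generator feeding a subdivide node.
graph = {
    "box": {"op": lambda size=1.0: {"faces": 6, "size": size},
            "params": {"size": 2.0}},
    "subdiv": {"op": lambda geo, levels=1: {**geo, "faces": geo["faces"] * 4 ** levels},
               "inputs": ["box"], "params": {"levels": 2}},
}
print(evaluate(graph, "subdiv")["faces"])  # 6 faces * 4**2 = 96
```

Editing `graph["subdiv"]["params"]["levels"]` and re-evaluating gives a new result without rebuilding anything, which is the appeal of the node workflow described above.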

02

DETAILS AND VARIATIONS

The icing is created by sweeping geometry along a spiral, then using volumes to again create a coherent shape. This approach can be used to add different kinds of toppings with a dedicated copy to points node, which makes it easy to get variation in the layout of the pieces. It is possible to create a variety of different cupcake designs that all spring from the same base shape. This network can be converted into a single node called a Houdini Digital Asset, then that is turned into a cupcake generator with top-level parameters to control the results.
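A copy to points node is essentially instancing with per-point variation: one template placed at many points, each copy jittered by a seeded random value. A rough illustration of the idea (plain Python, not Houdini's actual API; the point and seed handling are invented for the sketch):

```python
import random

def copy_to_points(templates, points, seed=0):
    """Conceptual sketch of a 'copy to points' operation: place a
    randomly chosen template at every point, with a per-point
    variation value that downstream nodes could use for scale."""
    rng = random.Random(seed)  # deterministic, like a seed parameter on a node
    copies = []
    for point in points:
        copies.append({
            "template": rng.choice(templates),   # e.g. a topping design
            "position": point,
            "variation": rng.uniform(0.9, 1.1),  # small scale jitter
        })
    return copies

toppings = ["cherry", "sprinkle", "chocolate"]
placed = copy_to_points(toppings, [(0, 0), (1, 0), (0, 1)], seed=42)
print(len(placed))  # one copy per point
```

Because the seed is part of the network, re-cooking with the same seed reproduces the same layout, while changing it gives a fresh variation.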

HOUDINI WORKS WELL WITH OTHERS SideFX have now made Houdini one of the most straightforward applications to get started with. Houdini Apprentice allows artists to learn Houdini without having to spend a penny. Freelancers can then move onto the Houdini Indie licence for the price of a couple of large coffees a month, which works with a variety of third-party renderers up to 4K resolution. SideFX also make Houdini Engine, which allows Houdini assets to be used with applications such as Maya, Cinema 4D, UE4 and Unity.

03

USE SOLARIS

The cupcake models are then brought into Solaris, which is a different kind of node network in Houdini dedicated to lookdev, layout and lighting. Here everything becomes part of a USD Scene Graph and can be used to prepare the model for layout. Randomisation can be added to make the cupcakes look less uniform. This involves the use of USD instancing combined with the control provided by the Houdini networks. One benefit of working in Solaris is that you have a dedicated layout and lighting environment backed up by the full proceduralism of Houdini.

04

RENDERING

Once the layout is all set up, lights and cameras are added and can be manipulated using interactive artist-friendly tools, and then everything is available to render using Karma. Karma is a Hydra-compatible renderer, which means that it renders USD directly. Karma is currently in beta and is available as part of Houdini. Other Hydra-compatible renderers include RenderMan 23, Arnold, Redshift and ProRender, which can all be used with the freelancer-friendly Houdini Indie edition, which is available on Steam as well as directly from SideFX. •



3D BOOTCAMP

GRAVITY SKETCH

In our second look at this VR modelling app, we explore how it has improved with the addition of subdivision modelling

Since we last delved into Gravity Sketch, the design and modelling tool for VR creatives, it has both gained ground in the VR creative space and become even more popular with artists around the world. However, a new feature has made Gravity Sketch the first application to warrant a second appearance in the Bootcamp series: the inclusion of a subdivision modelling workflow to complement the existing NURBS and spline workflow. While not every object that has been created in Gravity Sketch can be used to create a subdivision object, the available ones are a good start. The most basic is a single polygonal plane, which can be extruded either along an edge or by polygon face to create more geometry. Edge and polygon loops can also be selected and extruded. Vertices snap to each other to create new geometry and seal holes. If symmetry is being used, vertices snap to the centre line. Just remember that symmetry needs to be on from the start of a modelling session for it to work.

There is also a set of three new tools available for polygon modelling. These allow an artist to smooth out geometry, cut new edges and merge different objects into a single surface. These tools make it very straightforward to create basic characters and vehicles, and for many artists could provide a more straightforward workflow than starting to model in a standard 3D application. Subdivision smoothing is easily turned on and off to view a polygon mesh. And while selective colour cannot yet be applied to individual polygons, colour can be added as with every other object in Gravity Sketch. The new subdivision toolset has elevated Gravity Sketch into a unique, friendly modelling tool that provides a fresh, intuitive and affordable way for artists of any experience level to start modelling. Let's explore what it can do.


AUTHOR Mike Griggs Mike Griggs is a 3D and visual effects artist with vast experience across the industry, as both a creator and a technical writer. www.creativebloke.com


Gravity Sketch

01

START SKETCHING

While the subdivision and polygon modelling toolset is a great new feature, that does not mean the existing Gravity Sketch workflow should be abandoned. The sketch brushes can still be used to form a quick 3D sketch that allows the artist to get an idea of the volume early on. Keep the sketches to a single Gravity Sketch layer, which can be hidden or shown at various transparency levels as desired throughout the modelling process.

02

CREATE A POLYGONAL OBJECT

While a couple of the pre-existing NURBS tools, like surfaces, can be converted to a polygon/subdivision surface, in a new scene it is best to start working with a single plane, with symmetry already selected. To do this, go to the Primitive menu and select Plane. Make sure the subdivision rounded cuboid icon in the top right of the primitive plane is selected. This greys out the primitive objects Gravity Sketch cannot use to make a subdivision object.

03

WORK WITH A SUBDIVISION OBJECT

The workflow for subdivision objects is similar to that of manipulating any object in Gravity Sketch. Tap the top edit button on the secondary hand to switch into a view that allows the artist to manipulate points, edges and polygons. If symmetry has been turned on, vertices will snap to each other along the line of symmetry, creating a unified object and making initial modelling easier in Gravity Sketch than it is in many other 3D applications.

04

EXTRUDE EDGES AND POLYGONS

Extruding edges to create more geometry is simple in Gravity Sketch. Select one or more edges or polygons, then a single click on the primary controller's trigger pulls out a new edge or polygon. Vertices will snap to nearby points and seal edges. With the Auto Select Loops tool selected from the tool palette (visible on the secondary hand in edit mode), Gravity Sketch can select plausible edge loops and extrude a strip of polygons with one click of the trigger.

05

TOOLS FOR POLYGONS AND SUBDIVISIONS

The subdivision toolset is accessed by clicking on the lower button of the primary hand, which has the icon of a brush and hammer when in editing mode. The sphere with the stick is a smoothing tool, which smooths out point positions – remember to scale the scene in Gravity Sketch rather than the smooth tool itself. There is also a knife for cutting new edges into a polygon, and a merge tool which can combine separate polygon objects in a single layer.

06

ACTIVATE SUBDIVISIONS

In the edit palette, toggle the Off/On button under Subdivision Level. There are three levels of subdivision smoothing available. If edges are too smooth, edge loops can be easily added by single-clicking the trigger on the primary hand on an edge, which will then make an edge loop if there is a plausible polygon loop available. With 'Auto Select Loops' selected, an artist can manually slide the new edge loop to make a harder corner. •
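The smoothing tool's behaviour, moving each vertex part-way towards its neighbours, can be pictured as simple neighbour averaging (Laplacian smoothing). This is an illustrative sketch of the general idea on 1D vertex heights, not Gravity Sketch's actual implementation:

```python
def smooth(vertices, neighbours, strength=0.5):
    """One pass of Laplacian smoothing on 1D vertex heights:
    each vertex moves part-way towards the average of its neighbours."""
    result = []
    for i, v in enumerate(vertices):
        nbrs = neighbours[i]
        avg = sum(vertices[j] for j in nbrs) / len(nbrs)
        result.append(v + strength * (avg - v))
    return result

# A spike at index 1 is pulled down towards its flat neighbours.
heights = [0.0, 1.0, 0.0]
nbrs = {0: [1], 1: [0, 2], 2: [1]}
print(smooth(heights, nbrs))  # [0.5, 0.5, 0.5]
```

Repeated passes smooth further, which is why such tools are usually applied with a light touch.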



Practical tips and tutorials from pro artists to improve your CG skills

Maya Jermy Maya is a 3D artist and animator based in the UK. She started her career in 2012 remaking and animating characters for Oddworld: Abe’s Oddysee - New ‘n’ Tasty. mayajermy.artstation.com

Oscar Juárez Oscar is a 3D generalist based in Mexico City. He has been running Fibrha Studio since 2010 and specialises in archviz rendering, animations and Unreal Engine. www.facebook.com/FibrhaStudio

Antony Ward Since the early 90s Antony has worked for many of today’s top game and VFX studios, and has written three technical manuals and many online tutorials. www.antcgi.com

SOFTWARE: ZBRUSH

WHAT’S THE QUICKEST WAY TO MODEL CHARACTER ACCESSORIES? Darren Reid, San Francisco

Maya Jermy replies

GET IN TOUCH EMAIL YOUR QUESTIONS TO rob.redman@futurenet.com

Every artist has their own favourite techniques, pipelines and tools. It takes a lot of practice to develop some kind of working pattern that suits one's style, and with every project there is room for improvement or simply more practice. I am often asked about my step-by-step process for creating designs, and the truth is it is quite basic. Whether I am making characters, accessories or a piece of environment, I usually follow the same workflow: low poly -> high poly -> details -> retopo -> UVs -> textures and paint. If it is just concept art, I get to skip a few steps and render a beauty shot without having to unwrap or even retopologise my model. Between you and me, I prefer the creative part of the process over the technical. In an ideal world I would be happy not having to do any retopology at all. To assess the workflow, the main question should always be: what is the model for? Is it for a game, an animation, or is it supposed to show off our sculpting or texturing skills? Based on that we can work out what steps are necessary to complete the project. You do not really have to create UV maps for every piece you model, but it is good practice. To create a simple accessory like a belt, there is not much work involved. For anything hard surface, I always choose to work with ZModeler. It gives me great control over polygons, edges and vertices. Let's begin by utilising the standard geometry and adjusting it to our needs.

Your CG problems solved

EXPERT TIP

SILHOUETTE The silhouette is essential in creating a good-looking character, but can prove to be useful in the creation of accessories too. ZBrush 2020 has introduced the silhouette box in the top-left corner of the viewport. You can check your model's silhouette at any point without having to switch materials.

STEP BY STEP MODEL A BELT IN ZBRUSH

Detailed accessories enhance characters' appeal

01

LOW-POLY GEOMETRY

The easiest way to start any hard-surface low-poly model is to begin with the standard geometry, for example a cube. If you choose the cube.zpr project, you can initialise its settings to create the base for the buckle. Go to Tool>Initialize, set X to 4, Y to 1 and Z to 4, then press QCube.

02

ZMODELER

ZModeler is one of the best tools for hard-surface models. Right-click on one of the polygons, set the poly mode to Delete and remove the centre polygons as shown. Right-click an edge and set the edge mode to Bridge. This will allow you to connect edges and close off the mesh.

03

HIGH-POLY DETAILS

Unless the specific details, shape and style for your model have been previously specified, I like to make use of the alphas found in ZBrush, or download extra ones. You can easily create interesting patterns and a high level of detail with no need to actually paint or mask them by hand.

04

CAVITY PAINTING

With the buckle selected go to Tool>Masking>Mask by Cavity. With the mask on, choose a light-grey colour to cover the unmasked area, Ctrl-click on the canvas to invert the mask and choose a dark-grey colour to paint the cavities. You can use the same techniques to model and paint the other parts of the belt.
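Mask By Cavity drives the mask from local concavity: recessed areas receive more masking than raised ones. A toy 1D illustration of that idea (not ZBrush's actual algorithm; the heightfield representation is invented for the sketch):

```python
def cavity_mask(heights):
    """Toy 1D cavity estimate: a point lower than the average of its
    neighbours is concave (a cavity) and gets a positive mask value."""
    mask = [0.0] * len(heights)
    for i in range(1, len(heights) - 1):
        concavity = (heights[i - 1] + heights[i + 1]) / 2 - heights[i]
        mask[i] = max(0.0, concavity)  # only recesses are masked
    return mask

print(cavity_mask([1.0, 0.0, 1.0]))  # the dip between two peaks gets masked
```

Painting with the mask inverted, as in the step above, is what lets a dark colour settle only into the recesses.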


SOFTWARE: UNREAL ENGINE 4

HOW CAN I CREATE PBR MATERIALS IN UE4?

FOLLOW THE VIDEO http://bit.ly/3DW-robots

Courtney Dennett, Birmingham

Oscar Juárez replies

With the release of Unreal Engine 4 came a whole new way to think about and create our archviz projects. I will take you through a few easy steps for creating PBR materials in Unreal Engine, so we can work with them as we usually do with other software like V-Ray and Corona.

01

TEXTURES AND NEW MATERIAL

The first step is to load our textures in Unreal Engine; we need to create a folder to store our files so that Unreal can access them. Once we have our textures we have to create our material and name it MASTER. Open it and hold and drag the textures we will use. Once placed, change each one to a Parameter (right-click, Convert to Parameter) and name each accordingly: Diffuse (Diffuse map), Roughness (AO map) and Normal (Normal map).

02

DIFFUSE

Next we need to add all the variables we need. We will go for Diffuse first, and we will add a variable that will enable us to change the size of the texture on the polygon. Press M and right-click, and MULTIPLY will show. Now press S and click so we can add a PARAMETER. Set its default value to 1 and name it UV SIZE. This is the one we will change. Press U, right-click and we will have TEXCOORD. Connect this to MULTIPLY in A, UV SIZE in B, and then connect the MULTIPLY to the Base Color. Apply and save the changes. Close the material and create an instanced material. Open it and you will see a value called UV SIZE; this is the value we set in the slate of the material, so we can change the size. Now it's time to add more elements so we can change the position in X and Y. Open the MASTER material. Press A and ADD will appear, then right-click and type APPEND VECTOR. Select it and press S so we can have a PARAMETER. Here is where we will change the values. Name it UV X, create another and name it UV Y, set both default values to 1 and connect them to APPEND, X in A and Y in B. Connect APPEND to ADD in B, and now connect the TEXCOORD we placed before to ADD A. Then we need to connect ADD to MULTIPLY A, apply, save and open the instanced material, and we will now have UV X and UV Y values. Check each one and you will be ready to change the values and see how it affects the material.

EXPERT TIP

START SIMPLE Do not try to have many complex materials in your scene while it is taking shape. Keep it simple and when the whole thing is done, start adding complexity.
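The node chain built for the Diffuse channel amounts to a simple bit of per-pixel arithmetic: the texture coordinate is offset by UV X/UV Y and scaled by UV SIZE. A quick sketch of the equivalent maths (plain Python standing in for the material graph; names mirror the parameters created above):

```python
def transform_uv(texcoord, uv_x, uv_y, uv_size):
    """Mirrors the TEXCOORD -> ADD -> MULTIPLY chain:
    uv' = (texcoord + (uv_x, uv_y)) * uv_size"""
    u, v = texcoord
    return ((u + uv_x) * uv_size, (v + uv_y) * uv_size)

print(transform_uv((0.5, 0.5), 0.0, 0.0, 2.0))  # (1.0, 1.0): doubling UV SIZE tiles the texture
```

Exposing `uv_size`, `uv_x` and `uv_y` as parameters is exactly what makes the instanced material adjustable without reopening the MASTER graph.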

03

ROUGHNESS

Now we are going to add another option to our material, this time enabling us to change the strength of the roughness. Press M and right-click to add a MULTIPLY, and click S and right-click to add a PARAMETER (set its default value to 1 and name it ROUGHNESS). Connect the ROUGHNESS TEXTURE to MULTI A, and connect the ROUGHNESS to MULTI B. Finally connect MULTI to the ROUGHNESS in our master material. Apply and save, and if you open the instanced material we will now have the roughness value available.

04

NORMAL

Next we need to tweak the Normal map. Press 3 and search for a MaterialExpressionConstant3Vector. Set its values to 0,0,1, then press S and click to add another PARAMETER. Set its default value to 1 and name it NORMAL STRENGTH. Press L, then right-click and we will have a LERP; this will help us reduce or increase the strength of our normal texture. Connect NORMAL STRENGTH to ALPHA, the MaterialExpressionConstant3Vector to A, and finally connect our normal texture to LERP B. Apply and save. Go to the instanced material and you'll see that all the modifiers we need are available, similar to materials from other render engines. Now go ahead and change the different values to see how they work!
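Both tweaks boil down to simple per-pixel blends: roughness is the sampled map multiplied by the ROUGHNESS parameter, and the normal is a linear interpolation from the flat normal (0, 0, 1) to the sampled normal, driven by NORMAL STRENGTH. A sketch of the maths (plain Python, with names following the parameters created above):

```python
def adjust_roughness(sample, roughness):
    """MULTIPLY node: scale the sampled roughness by the parameter."""
    return sample * roughness

def adjust_normal(sample, strength):
    """LERP node: blend from the flat normal (0, 0, 1) to the sampled
    normal as NORMAL STRENGTH goes from 0 to 1."""
    flat = (0.0, 0.0, 1.0)
    return tuple(f + (s - f) * strength for f, s in zip(flat, sample))

print(adjust_normal((0.2, 0.4, 0.9), 0.0))  # strength 0 gives the flat normal
```

At strength 0 the surface shades as if the normal map were absent, which is why the LERP makes a handy per-instance "bump amount" dial.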


It’s frustrating when things are accidentally deleted in your scene, but can you prevent it from happening?

SOFTWARE: AUTODESK MAYA

IS THERE A WAY TO STOP SOMETHING BEING DELETED IN MAYA? Steven Pickering, Milton Keynes

Antony Ward replies

For a technical artist, as well as making your rigs more efficient and user-friendly, you also need to make sure they can't be inadvertently broken by the animator. Unfortunately, there are lots of ways this can happen: with just an innocent pick walk up the hierarchy, a group or another node ends up being animated and keyed rather than the intended control, which can cause all sorts of issues, especially when exporting to a game engine. Luckily, most issues can be easily avoided by simply taking the time to go through and lock any attributes that are off limits. It also helps to hide them too, just to be extra cautious. These actions do help to keep your rig tamper-proof, but one thing they can't do is prevent someone from simply selecting a part of the rig and deleting it. Now, deleting an obvious part like an arm or a leg can be undone, but it's those hidden nodes, the important ones holding key data for the scene or the rig's construction, that are easily mistaken for rubbish and deleted while cleaning the scene. So how do you prevent them from being removed? Like most things in Maya, the answer isn't found in the menus; instead it's a simple piece of code:

cmds.lockNode("nodeName", lock=True) With this bit of Python you can lock the intended object. All you need to do is replace "nodeName" with the name of the object you wish to lock. For example:

cmds.lockNode("arm_Control", lock=True) Now if you try to delete it, you will get the following error:

// Error: file: C:/Program Files/Autodesk/Maya2019/scripts/others/doDelete.mel line 111: Cannot delete locked node 'arm_Control'. // (If you want to reverse this and unlock something instead, just set the "lock" flag to "False".) Your nodes will be nice and safe now and will resist any attempt to remove them.
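To protect several nodes at once you can wrap the same command in a small helper. This is a hypothetical convenience function, not part of Maya; the `lock_fn` argument simply lets the sketch run outside Maya, where `maya.cmds` is unavailable:

```python
def lock_nodes(node_names, lock=True, lock_fn=None):
    """Lock (or unlock) every node in node_names using cmds.lockNode.
    lock_fn defaults to maya.cmds.lockNode inside Maya; pass a stand-in
    callable to experiment with the helper outside Maya."""
    if lock_fn is None:
        from maya import cmds  # only importable inside Maya
        lock_fn = cmds.lockNode
    for name in node_names:
        lock_fn(name, lock=lock)
    return list(node_names)

# Inside Maya you would call, for example:
# lock_nodes(["arm_Control", "spine_Group"])
```

Running it once over your rig's key data nodes at publish time means nothing important can be swept away during a scene clean-up.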

EXPERT TIP

MEL ALTERNATIVE If you aren't a keen Python user you can also use a similar bit of code with MEL, it just needs reformatting to something like this: lockNode -lock on "nodeName" – to unlock something, just change the -lock flag to off.




News and views from around the international CG community

MEET THE ARTIST

Tim Doubleday

The experienced product manager shares his insight into the world of visual effects and motion capture

Tim Doubleday is the VFX product manager at Vicon, helping develop their motion capture tools such as Shogun. We chatted to him about his career journey, studio setup, and predictions for the future of the industry.

What got you into the industry? I feel lucky to have grown up alongside the introduction of the internet and PCs becoming available on a commercial scale. I remember the introduction of bulletin boards and the availability of early 3D rendering software like V-Ray and 3ds Max and being amazed at the possibilities that computer graphics could offer.

It wasn't until a year into an Art Foundation that I realised how this emerging industry could not only be amazingly fun but also offer a potential career. This was further cemented while doing a three-year BA in Computer Visualisation & Animation at Bournemouth University. At the time there were only a couple of university courses and I delayed a year to do the Art Foundation to make sure I got a place. In hindsight this was the right decision as it introduced me to motion capture along with numerous other animation techniques. While we didn't have access to a motion capture system at Bournemouth, it was starting to be used more in videogames like Virtua Fighter and films like The Matrix. It was actually one of my lecturers, John Vince, who introduced me to Vicon and I owe him so much because of this.

What is your daily work life like? I'm actually in my third stint working at Vicon after gaining invaluable production experience at companies like The Imaginarium and Audiomotion Studios. Vicon is a great place to work, especially having grown up in Oxford. I get to cycle to work, which is fantastic, and Vicon has a weekly frisbee game which is a great way to split the week. Having done crunch and long days on set, the more relaxed hours are a godsend, although I do miss those days; being on a film set is an incredible feeling and boy do I miss the free snacks from Craft Services! While I have my fair share of meetings I try to offset these with product-focused days and try to organise mini shoots when possible. We also try to bring something new to trade shows like GDC and Siggraph, so it's exciting getting to collaborate on these projects. I'm also lucky in the fact that I get to visit customer sites and see the amazing work they are creating using our software.

Avid gamer Doubleday has a workspace personalised with a variety of neat gaming memorabilia, particularly from the likes of the Legend Of Zelda series

Right: Vicon produce the mocap hardware as well as the software

What's your setup and what kind of software do you use? Since I travel a lot I use a 13-inch Surface Book the majority of the time: having a dedicated GPU means it can run all the 3D software, game engines and Shogun, which is the name of our motion capture software. I then have a desktop machine for any heavy lifting including project work for shows. I've recently upgraded this to an AMD Ryzen 9 3950X and an RTX 2070, which absolutely flies. Our current Digital Human project doesn't make use of raytracing yet, but I hope that we can add it in some form in the future! I was trained in Autodesk Maya and then picked up Filmbox, which became MotionBuilder. At the time MotionBuilder was the only 3D software to offer anything close to real-time performance.

An intensive work schedule makes hitting the slopes all the more valuable

"I SEE MACHINE LEARNING HAVING AN EVEN BIGGER IMPACT ON THE MOCAP INDUSTRY"

Now game engines have taken that further by adding support for complex shaders, lighting and even real-time raytracing! We support both Unity and Unreal in our motion capture pipeline but personally I've had more experience using Unreal. While I'm no expert I can at least set up animation blueprints and create interesting environments and characters for the motion capture to take place in.

Motion capture is often seen as the enemy of keyframe animation. How do you feel about this? I can see both sides of this argument really. I think over the last 20 years, as the VFX and game markets have exploded, it has become a necessity to use motion capture as a means to deliver enough human-looking motion. This has left traditional animators feeling like work is being taken away from them. I can see their point but I think there is enough animation work to go round. There are also always things that you can't motion capture, like creatures and other non-human characters. Not to mention the huge range of games that have biped motion that needs to fit into a precise animation loop. The Souls games for example might use motion capture as a base but then be heavily keyframed to fit the gameplay so it feels tight and satisfying.


Doubleday is equally at home on stage or in the saddle

How do you see motion capture and keyframe animation working together? Having delivered final animation on a number of videogames including Battlefield V, I can definitely state that no matter how good your motion capture data is, it's always going to need an artist to add their touch. Whether it be face or body animation, there are always going to be areas of the animation that don't hit the exact look you are after. The role of a motion editor has become a well-known part of the motion capture pipeline. It requires both a technical understanding of how the retargeting process works and the ability to massage the motion and get the look you are after. The process of going from motion capture data onto a specific character rig often includes manipulating weights and offsets, which is a real skill in itself.

Where do you see the mocap industry going in the next ten years? As motion capture has become more widely adopted the tools have had to become easier and quicker to use. A big part of this is the switch to real time on set and I see this only becoming more important in the future. Being able to visualise the motion capture shoot in real time within a game engine using close to final quality assets has been common in the last five years or so. This requires a lot of prep before the shoot and removes the idea of fixing things in post, although this can still take place if required. TV shows like The Mandalorian are using live camera tracking using optical motion capture to help remove the need for traditional film sets. By mixing live CG elements and backgrounds rendered across huge LED walls, the need for costly lighting crews and set dressing is greatly reduced.


This allows productions to save money and work within the confines of TV budgets while delivering film-level quality VFX. I also see machine learning having an even bigger impact on the motion capture industry over the coming years. The deaging and facial capture work that ILM did on The Irishman is a great example of how machine learning can be used to train a facial rig based off thousands of photos and create much younger versions of the actors. This combined with removing the head-mounted camera (HMC) and tracking dots from the actor’s face feels like a real game changer for performance capture. Machine learning is also being used by Ubisoft in their videogames to help train motion models based off motion capture. This helps with delivering incredibly realistic movement to characters at runtime – as an avid gamer I love this technique!



SuperAlloy created an action narrative project in VR to demonstrate their in-house talent and technology

A DAY IN THE LIFE

Lights, camera, action! Eric Jacobus gives 3D World the lowdown on SuperAlloy Interactive and his action-packed career so far NAME Eric Jacobus JOB TITLE Head of production, co-founder STUDIO SuperAlloy Interactive LOCATION USA ABOUT Eric Jacobus merges tech experience and stunt expertise with his new action design studio. WEB superalloyinteractive.com

Eric Jacobus has been an actor, stuntman and action coordinator since 2001, as well as having a background in programming. He has gone on to have a varied career that includes producing, starring, stunt coordinating and directing. Jacobus also co-founded SuperAlloy Interactive, an action design studio that specialises in motion capture, combat design and previsualisation. He recently sat down with 3D World to discuss an action-packed day in his life.

What does a typical day at SuperAlloy involve for you? Every day, I'm up at 5am to read my Bible and see my kids when they wake up. Otherwise there is no typical day. For shoot days, I arrive at the studio at 7am to set up any tech for previsualisation, motion capture or virtual production. Then at 9am we will begin eight hours of choreography, previs, mocap, falls, reactions, attack animations, navigation or cinematics. Some days it's 12 hours of what I call 'WEEC work': workout, email, editing, calls. Other days are travelling, consulting, trade shows, meetings.

Can you describe your role as co-founder of SuperAlloy Interactive? I was in the world of stunts and filmmaking for decades, but my first job was actually as a PHP and Visual Basic programmer, so I'm like a techy-stuntman-filmmaker who can do action design using Unity or Maya. The market is shifting so quickly, so if we want to innovate with action design and choreography, we need to understand these tools.

What's the best thing about your role? One day I could be consulting for a German sci-fi cinematic, the next could be mocap on a Polish indie action game, and then wirework for 3D previs on an Indian film. It's a stream of interesting projects. I have to bring the best talent for every job so I don't miss a beat. The best part is I'm only working ten miles from my wife and kids. And my work allows me to keep the Shabbat (Sabbath) on Friday night, when I stop working for 24 hours.

What kind of day-to-day challenges do you face? Mundane challenges like server problems, hardware issues, or internet outages are common


for any tech-based company. We push ourselves to innovate, like when we created a virtual camera system combining HTC Vive and Xsens motion capture, but the technology is often totally unpredictable. That's the cost of innovation. We just have to constantly test and test to make sure these issues don't pop up when we're with clients.

How do you work with clients to bridge the gap between action and game design? Our clients usually don't live action the way we do. We open that window into the world of violence and combat in a digestible way. This lays the foundation for our client relations. We tailor our performers' movements to best fit the needs of animators, or we'll design action that will look the best for the cinematic director's filmmaking vision. We create a transparent and simplistic interface between tech and physical performance.

DEVELOPING THE VISION Eric Jacobus delves further into the world of action sequences and stunts

Eric Jacobus at work, applying his stunt expertise to the motion capture process

Can you tell us about SuperAlloy? Action design is all in the process. In film, you can create great action or comedy, but someone else down the assembly line can then destroy it – a bland camera angle, a bad edit, a misplaced sound effect. So we not only create the action for the director's vision, but we develop a process with the team to ensure the action vision survives through to the end.

What elements are essential to a great video game action sequence? We can create great moves, cool choreography, insane weapons and huge production design, but if story isn't driving the action scene at every moment, then it's just movement for movement's sake. If story drives the scene, like a Zatoichi fight or Sergio Leone gun battle, you can deliver an entire scene with a single move. Everything must support the story.

"WE CREATE A TRANSPARENT AND SIMPLISTIC INTERFACE BETWEEN TECH AND PHYSICAL PERFORMANCE"

What kind of software and technology do you use? We use Capsule, Trello, Slack, Dropbox, and Box to manage our workflow. We shoot mocap using Xsens MVN and bridge a HTC Vive puck-based vcam using Brekel, pipe it into MotionBuilder for the client to see the live rig, stream over Zoom using OBS, maybe deliver retargeted anims using Maya. I'll prototype combat in Unreal, or edit a playblast in Unity's Cinemachine, export videos into Adobe Premiere, and post online using WordPress.

How does a typical day end for you? At the end of a shoot day we debrief, break down everything, de-marker and wash the suits and offload, before processing and compressing our footage or mocap. After work it's dinner with my family, shower if I have time and sleep like a rock.


What advice can you give to artists that want to follow in your footsteps? Look at steel. An iron worker saw the potential in combining iron with other elements like titanium and manganese. He alloyed iron with other elements and developed steel, which became an entire new economy. Humans don’t really create anything, we combine and innovate. That’s what comedy is: alloying two seemingly unrelated things together for the first time in a logical way that makes sense. Does motion capture change how you approach action and stunts? Motion capturing a game’s cinematic cutscene is like filmmaking. You might need to imagine where the camera is, and you can fix attack animations in post, so you can be ten feet from your opponent and still hit him. Obviously, though, if we don’t know where the camera is we can’t play to it. For in-game mocap, the camera is fixed and the entire moveset plays to that camera angle. Every piece of action is a single move or combo that has exact parameters. That precision makes it an awesome challenge.

3dworld.creativebloq.com


BACK ISSUES Missing an issue of 3D World? Fill the gaps in your collection today! ISSUE 257 MARCH 2020 MASTER CHARACTERS O We head inside Games Workshop, the masters behind the miniatures, for an in-depth look at their work processes O Explore the world of digital de-aging technology O Make your very own Pickett the Bowtruckle in Cinema 4D O Downloads Free video training, files and more!

ISSUE 256 FEBRUARY 2020 THE WETA ISSUE

O 3D World teams up with Weta to deliver exclusive content from the incredible artists behind some of the most iconic effects in film O How to render like Pixar: Part 2 of our tutorial series O Inside Milk VFX Part 5: The art of compositing O Downloads Free video training, files and more!

ISSUE 255 JANUARY 2020 ACTION ANIMATION

O Explore the second instalment of Guillermo del Toro’s animated Tales Of Arcadia trilogy O Discover the secrets to bloodcurdling horror VFX O Part 1 of our new tutorial series on how to render like Pixar O Inside Milk VFX Part 4 O Downloads Free video training, files and more!

ISSUE 254 CHRISTMAS 2019 TRANSFORM YOUR 2D TO 3D

O Discover how our cover image was created with a step-by-step guide to translating 2D artwork to 3D O Find out how to successfully sell your 3D assets O Create stunning steampunk art O Behind the scenes of DreamWorks Animation’s Abominable O Downloads Free video training, files and more!

ISSUE 253 DECEMBER 2019 EXPLORE CG ANIMATED SHORT FILMS

O Learn the essential ingredients for building your own animated short O Inside Milk VFX Part 3, this month delving into environment building O Discover the special effects behind space adventure Ad Astra O Sky Captain And The World Of Tomorrow’s impact on the industry O Downloads Free video training, files and more!

CATCH UP TODAY! Visit Apple Newsstand, Pocketmags and Zinio stores to download a back issue of 3D World to your tablet or computer.



ISSUE 252 NOVEMBER 2019 POWER UP YOUR CG SKILLS

O Find out how artists can utilise social media to build their brand O The making of Stranger Things’ gruesome Mind Flayer O Scanline VFX talk Game Of Thrones’ destructive dragons O Inside Milk VFX Part 2, this issue focusing on character animation O Downloads Free video training, files and more!

ISSUE 251 OCTOBER 2019 THE STATE OF VR: THE LATEST TOOLS, TIPS AND TECHNIQUES

O Discover the revolutionary impact of virtual reality technology O The first part in a new series exploring Milk VFX’s work on major TV shows and films O Enhance your imagery with our expert compositing tips O Downloads Free video training, files and more!

ISSUE 250 SEPTEMBER 2019 250TH ISSUE ANNIVERSARY SPECIAL

O 250 expert tips and tricks in celebration of our big milestone O Artists from the likes of Pixar and Ziva take a look back at the growth of the industry over the last two decades O Discover the CG artistry behind Toy Story 4 O Downloads Free video training, files and more!

ISSUE 249 AUGUST 2019 DETECTIVE PIKACHU

O Discover the VFX secrets behind adapting the Pokémon world for the big screen O ILM talk Avengers: Endgame O Explore some of TV’s biggest sci-fi shows and the CG work behind them O Go behind the scenes with the American Gods visual effects team O Downloads Free video training, files and more!

ISSUE 248 JULY 2019 DIGITAL SCULPTS: ZBRUSH TRICKS AND TECHNIQUES

O Learn from the pros how to enhance your sculpting skills O The incredible effects work behind film end credit sequences O How DNEG does… Part 6, the final instalment, discussing the studio’s approach to compositing O Downloads Free video training, files and more!

ISSUE 247 JUNE 2019 THE VFX BEHIND DUMBO

O The secrets behind the reimagining of this classic character O Discover what goes into creating game cinematics and trailers O How to build a stunning mechanical dragon in VR O How DNEG does… Part 5, on effects simulations O Downloads Free video training, files and more!

ISSUE 246 MAY 2019 CONSTRUCT ARCHVIZ BUILDS

O Three leading experts explore the foundations of archviz success O Welcome To Marwen’s virtual cinematography gets uncovered O Create dripping fluid effects in X-Particles and Cinema 4D O James Lawley discusses his inspiring archviz creations O Downloads Free video training, files and more!

ISSUE 245 APRIL 2019 ALITA BATTLE ANGEL

O Discover the incredible facial mocap secrets behind Alita O Artists from the world of sci-fi talk creating amazing space scenes O Find out what the future of 3D printing holds for many industries O How DNEG does… Part 4, on complex CG environments O Downloads Free video training, files and more!

APPLE NEWSSTAND bit.ly/3dworld-app POCKETMAGS bit.ly/pocket3D ZINIO bit.ly/3dw_zinio



Lorna Burrows

PRO THOUGHTS

CES 2020: what’s the verdict for VR and AR? Lorna Burrows takes a quick look at some of the VR and AR announcements from the world’s biggest tech show. Headset design has long been pointed to as one of the reasons VR is yet to become ‘mainstream’. Often headsets can be heavy and unbalanced, and the lenses don’t always match up with your eyes. Headset technology has definitely come on in leaps and bounds over the last few years, with a few key developments unveiled at CES 2020.


EYE TRACKING Not only did Pico Interactive debut their own VR glasses prototype, they also showcased their Neo 2 Eye VR headset, which includes eye-tracking technology in partnership with a company called Tobii. The headset tracks eye movements – and can even do so without calibration – allowing the user to activate aspects in the VR world just by looking at them. What’s more, eye tracking makes foveated rendering possible – when only the part of the environment the user is looking at directly is rendered in high quality, cutting the processing power needed for high-end VR experiences.
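Foveated rendering, as described above, can be pictured as a simple quality falloff around the gaze point. The sketch below is an illustrative toy function only — not Tobii’s or Pico’s actual implementation — and the radii and quality levels are assumptions:

```python
# Toy model of foveated rendering: full resolution in the foveal region
# around the gaze point, reduced resolution in the periphery, and a
# linear falloff in between. All coordinates are normalised to 0-1.
def foveation_scale(pixel, gaze, inner=0.1, outer=0.3):
    """Return a render-resolution multiplier for a pixel given the gaze point."""
    dx, dy = pixel[0] - gaze[0], pixel[1] - gaze[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist <= inner:
        return 1.0    # foveal region: render at full quality
    if dist >= outer:
        return 0.25   # periphery: render at a quarter of full quality
    t = (dist - inner) / (outer - inner)
    return 1.0 + t * (0.25 - 1.0)  # linear blend between the two levels
```

Only the region the eye is actually pointed at gets the expensive treatment, which is where the saving in processing power comes from.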

STAYING IN CONTROL Pico Interactive’s Neo 2 controllers offer 6DOF using inside-out tracking to track both the headset and touch controllers. It works by combining the data from an electromagnetic (EM) sensor and an IMU contained within the controller, which gives us a positionally tracked controller that doesn’t suffer from occlusion. Nolo, meanwhile, exhibited a 6DOF controller that you can add to smartphones, PCs or headsets – essentially creating an accessory bundle that enables SteamVR games to benefit from motion tracking. It’s even tipped to work with 5G too.
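The EM-plus-IMU pairing works because the two sensors fail in opposite ways: the IMU is fast but accumulates drift, while the electromagnetic reading is absolute but slower and noisier. A minimal complementary-filter blend illustrates the idea (an assumption for illustration, not Pico’s actual algorithm):

```python
# Complementary filter: trust the responsive IMU estimate short-term
# (alpha near 1) and let the absolute EM reading slowly pull out drift.
def fuse_position(imu_estimate, em_position, alpha=0.98):
    """Blend a high-rate IMU position estimate with an absolute EM reading."""
    return tuple(alpha * i + (1.0 - alpha) * e
                 for i, e in zip(imu_estimate, em_position))
```

Because neither sensor is a camera, the blended estimate keeps working when the controller is hidden behind the user’s body — the occlusion problem mentioned above.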

BRAIN POWER NextMind, a Paris-based startup, revealed its Brain-Computer Interface (BCI) developer kit – essentially a non-invasive EEG that the user straps to their head. Eight electrodes detect brainwaves, which are then connected to the digital world, enabling the user to control a VR environment with just their thoughts. It’s been used in the medical sector already, but this is the first time it is being seen in the immersive space. It’s pretty exciting stuff.

HAPTIC FEEDBACK The VR industry is constantly striving to improve the realism of immersive experiences. That’s why there’s been an uptake in 4D systems, and now haptic accessories that give the user physical sensations to match those in the VR world are growing in capability and popularity too. bHaptics introduced their Tactsuit line of wearable haptic accessories, including a vest that gives powerful haptic feedback to the entire torso! This is definitely something to keep an eye on, especially in the gaming and location-based entertainment sectors.


AR GLASSES Augmented reality glasses haven’t had an easy ride so far – it’s been challenging to create something comfortable and wearable with accurate tracking. So it’s interesting to see the praise being heaped on Chinese company Nreal’s Light glasses. These glasses are tethered to an Android phone and have their own touchpad controller, microphones and speakers – and they track the user’s eye movements while 3D mapping the view in front of them, so that digital characters and objects can be seen to interact with the environment. While some called them ‘clunky’, the glasses gave a necessary boost to the possibilities for head-mounted AR.

A GENUINELY USEFUL TOOL While CES is always packed with exciting announcements, with lots of interesting ideas about where VR and AR will head in the future, one aspect of the show was clear. VR and AR weren’t simply limited to product announcements – the technology was frequently used to showcase other technological products, from simulating flying taxis of the future to visualising the next 15 years of the automotive sector, and much more. This just goes to show that while VR and AR still have a lot of progress to make in terms of future technological developments, they have already proven their worth as a genuinely useful tool. Find out more at weareimmersive.co.uk




Franklin helped Christopher Nolan to depict a number of otherworldly environments for Interstellar

PASSING THE TORCH Paul Franklin reveals what excites him most about the future of the VFX industry

“The most exciting thing about VFX is that anything is possible,” Paul Franklin, creative director of DNEG, tells 3D World. “A few years ago we reached a point where pretty much any image a filmmaker can come up with, you can now create.” Exploring the creative possibilities presented by digital techniques excites Franklin and he relishes the prospect of the younger generation creating visuals he can’t even imagine. “Just as my current role didn’t exist when I started back in 1989, each generation will create and explore new possibilities,” he adds. Franklin points out that the next generation of visual effects artists is one that has grown up alongside computers and technology. “I didn’t grow up with computers, they weren’t accessible to kids in the 1970s,” he reflects, “you only saw them on Star Trek and Doctor Who. Now they’re everywhere.”




INDUSTRY INSIGHT

Dream a little bigger DNEG’s creative director, Paul Franklin, tells 3D World about his journey from sci-fi fan to VFX legend

Having worked in VFX since the Nineties, co-founded DNEG and won two Oscars for his work with cinematic auteur Christopher Nolan, Paul Franklin knows a thing or two about the discipline. 3D World caught up with him at BFX Festival 2019, to discuss his career so far, the relationship between art and science, as well as the future of the industry. “I was always interested in science-fiction films, TV shows and comic books,” Franklin says, reflecting on his journey into VFX. It wasn’t until his studies in fine art, initially at the Cheshire School of Art and Design before switching to the Ruskin School of Art in Oxford, that the world of visual arts was opened up to him. “If I couldn’t be a real astronaut then the next best thing was building the worlds and environments that you see in films.” Initially working on short films, theatre and magazine design, Franklin became enthralled with the emerging art of VFX, which was on the rise in the late 1980s. “Computers were turbo-charging the VFX business and it brought together everything I was interested in,” he explains, “which was expressing myself visually, telling stories, animation, art design, filmmaking and working with technology.” “Film was always my first love,” Franklin adds, “because of the way you can spend time telling a story and put the emphasis on that storytelling.” Franklin’s first shot at feature film VFX came with the 1995 thriller Hackers. Three years later he would

Images courtesy of DNEG © Warner Bros. Entertainment Inc. All rights reserved.

join forces with a group of fellow VFX professionals to form DNEG, the renowned VFX and animation studio, born from a shared desire to work on feature film effects. The team seized an opportunity to create VFX for Pitch Black, the 2000 film starring a then unknown Vin Diesel, and built their company around that. “We had already worked together for a few years at that point, so we all knew each other,” Franklin adds. “We hoped we might last a couple of years, just to get the movie done and maybe do another project, but here we all are 21 and a half years later.” These days Franklin fulfils the role of creative director at DNEG, which sees him working across all of the studio’s projects. “I provide expertise and experience from my career, as well as working to design a creative response for our clients,” he explains. Franklin helps filmmakers to consider how they might tell their story visually, what ideas they could bring into play and what the right balance of VFX would be. “A lot of filmmakers out there are quite new to this and techniques are changing all the time. We’re always working to come up with new ideas for them, which allows us to be more innovative.” Franklin has utilised this innovative approach to VFX on several films with director Christopher Nolan, a filmmaker often revered for his commitment to in-camera effects. “Chris is a filmmaker who likes to put reality in front of the camera,” Franklin reflects, “objective reality that you can actually film.” This involves everything


from elaborate sets to daring stunts and dramatic pyrotechnics. “He knows so much about filmmaking techniques and he’s an expert in every area of the process,” Franklin continues. “The role of visual effects in his filmmaking is to extend his reach, to do the things that you can’t do for real, but they should always reference reality.” Most of the time this involves shooting something in-camera and then building on top of it with VFX. “It could be a shoot in downtown Chicago that we turn into Gotham City,” he adds, “or a stuntman being yanked on a wire becoming the extraordinary moment where Batman is pulled away by a train.” For his 2014 sci-fi epic Interstellar, Nolan and his crew headed to the harsh surroundings of Iceland to shoot on glaciers that could portray the unforgiving terrain of an alien world. “We wanted that harshness to appear on screen,” says Franklin, “and the way to do that is to go somewhere that offers you that. Then the visual effects team extend the environment and turn it into this extraordinary world you see in the film.” DNEG’s work on Interstellar received widespread acclaim for its scientific accuracy, but Franklin maintains that it’s important to never lose sight of the story. “If you have an appreciation for how things work then you’re much more in control and can tell your story more effectively. Physics simulations and exploring scientific theories are very much a part of VFX, but we should never lose sight of storytelling and allow technology to take over.”



INDUSTRY INSIGHT

Animating in real time How e.d. films are pushing the boundaries of real-time animation thanks to an Epic MegaGrant

Whether creating short films or their own unique tools, e.d. films are pioneers in the realm of real-time animation. Recently they announced a slate of three projects, made possible by an Epic MegaGrant. The grant is part of a $100 million initiative designed to support creators innovating with Unreal Engine. “Being one of the first animation companies to receive a grant is a big deal for us,” says Emily Paige, president of e.d. films.

“As a boutique studio in an indie market of Montreal, Canada, that specialises in cultural and educational projects, we do not naturally have the glam that it takes to get eyeballs on our work.” The grant has been awarded in recognition of e.d.’s research into real-time animation since 2015, as well as the hundreds of workflow and animation tutorials that have been shared online since 2008. “It helps offset some of the risks we took,” Paige continues, “on a company and on a project level, in order to adventure into the unknown and unproven areas of art, technology, production and creation. We want to make storytelling easier for ourselves and for others like us, who are passionate about the craft.” A combination of experimental and proven techniques will be used by the studio across their upcoming slate, which consists of two short films and a VR experience. “With the grant came technical support and hands-on attention from Unreal which allowed us to confidently enter into a space of experimentation and development,” Paige explains. The studio were able to quickly assess looks and techniques, tweaking them in real time. Internal research and development processes were adapted, applying all the knowledge from their years of researching game engines and production tools.

Hairy Hill director Daniel Gies is known for pushing the limits of animation through customised scripting and creative programming

A teaser trailer for Hairy Hill can be found at edfilms.net/en/studio/home-was-hairy-hill

Upcoming short Hairy Hill will explore a rural family’s relationship with nature and tragedy through an artful blend of traditional 2D, paper puppetry and performance-driven animation. To make the short, e.d. will employ a mix of custom tools and features native to Unreal in a largely Maya-centric workflow. “Since Hairy Hill uses 3D characters in dense 2.5D, hand-painted backgrounds, it was imperative that we expand on our PSD to 3D plugin for Maya,” says Paige. “With hundreds of layers in every painting, it became clear we needed to rethink the PSD to 3D tool workflow.” Ordinarily the plugin would create a unique texture for every layer of a Photoshop image; when this proved too heavy for e.d.’s game engine workflow, they added a feature to distribute the Photoshop layers into a single or multiple 2K, 4K or 8K PNGs. This ultimately reduces the number of textures and materials generated for a scene. “This feature will be coming to the next release of the plugin, and it is already saving us countless hours,” adds Paige. As the studio dug deeper into Unreal Engine, they were able to make use of its advanced instancing tools for painting assets onto terrain and procedurally populating the film’s different environments. The time they saved could then be redirected into polishing and perfecting each shot.

Everything that e.d. learnt about Unreal Engine on the production of Hairy Hill was utilised on Three Trees, a children’s story about three trees finding friendship and discovering their place in the world. The story takes place in a stylised forest, the specific and illustrative aesthetic of which proved a challenge for the studio. “Initially we had tried painting a collection of textures to wrap around the trees,” Paige recalls, “but the repetition and tiling became instantly apparent once the scene was populated. So we found a way to generate the textures almost entirely in Unreal procedurally, using only a few hand-painted elements that could be used together in a variety of ways to drive the texture generation.”

Three Trees is directed by Aaron Hong, with M.R. Horhager responsible for writing and co-directing

Traditional 2D art direction combines with Unreal Engine’s capabilities to create Three Trees’ unique visual style

e.d. films’ teaser poster for their ambitious and rapidly evolving VR experience The Garden
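The layer-packing feature described above boils down to assigning each Photoshop layer a cell in one large atlas texture instead of its own file. A hypothetical sketch of the layout step (the function name and uniform grid strategy are illustrative assumptions, not e.d. films’ actual plugin code):

```python
import math

# Plan a texture atlas: give every Photoshop layer a square cell in a single
# power-of-two texture and record its UV rectangle for the material to sample.
def plan_atlas(layer_names, atlas_size=2048):
    """Return {layer_name: (u, v, size)} with all values in 0-1 UV space."""
    cols = math.ceil(math.sqrt(len(layer_names)))  # smallest square grid
    cell = atlas_size // cols
    uvs = {}
    for i, name in enumerate(layer_names):
        x, y = (i % cols) * cell, (i // cols) * cell
        uvs[name] = (x / atlas_size, y / atlas_size, cell / atlas_size)
    return uvs
```

One 2K, 4K or 8K atlas then stands in for hundreds of per-layer textures, which is exactly the texture- and material-count saving Paige describes.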

“WE WANT TO MAKE STORYTELLING EASIER FOR OURSELVES AND FOR OTHERS LIKE US, WHO ARE PASSIONATE ABOUT THE CRAFT”

Real-time capabilities let e.d. avoid the element of guesswork that occasionally comes with animated productions, and answer visual questions on the fly. “For any uncertainty that remained, we could bring the director in, sit with them and adjust the shot at full visual fidelity in real time,” Paige continues. “That became one of the best parts of the project. The director was able to sit with us and direct an animated short as though it were a live-action set.” Musical VR project The Garden is headed up by David Barlow-Krelina, an integral member of e.d.’s art and technology team. “The Garden is ambitious,” admits Paige, “but at the same time it was born out of David’s early research and development in Unreal Engine, in particular, experiments using audio waveforms to trigger timeline events and animate parameters on rigs and materials. It is a project that is evolving and blossoming in a curiously synchronistic and altogether enticing way.” Even with three ambitious projects on the go, e.d. remain full of ideas and innovations. “The grant won’t allow us to achieve every vision we have,” says Paige, “but it is helping us to prioritise and focus on our own projects, in ways that we wouldn’t have otherwise as a small studio with a big imagination.”
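The audio-driven technique Paige describes — waveforms triggering timeline events and animating rig or material parameters — reduces to extracting an amplitude envelope and remapping it onto a parameter range. A hedged pure-Python sketch (the function names and windowing strategy are assumptions, not e.d. films’ code):

```python
# Extract a per-window peak-amplitude envelope from raw audio samples,
# then remap it onto an animatable parameter range (one key per window).
def amplitude_envelope(samples, window):
    """Peak absolute amplitude per window, normalised to the 0-1 range."""
    peaks = [max(abs(s) for s in samples[i:i + window])
             for i in range(0, len(samples), window)]
    top = max(peaks) or 1.0  # avoid dividing by zero on silence
    return [p / top for p in peaks]

def drive_parameter(envelope, lo=0.0, hi=1.0):
    """Map the envelope onto a rig or material parameter range."""
    return [lo + e * (hi - lo) for e in envelope]
```

In an engine, each remapped value would become a timeline key or be fed straight to a material parameter every frame.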



Above: A still from Together, in which Dell Blue teamed up with Google to introduce the Latitude 5300 Right: 3D techniques and motion graphics combine to advertise the OptiPlex 7070 in The Unexpected PC




INDUSTRY INSIGHT

Into the blue Dell Blue teamed up with The Mill to create a series of eye-catching adverts that captured the essence of their new computers

Dell’s internal creative agency, Dell Blue, recently created ad campaigns for three of the computing giant’s latest products: the OptiPlex 7071 Tower, a premium business desktop designed for maximum VR performance; the OptiPlex 7070, a new form factor providing a fully modular desktop experience; and the 5300/5400 Dell Latitude 2-in-1 with Chrome Enterprise OS. Dell Blue partnered with award-winning VFX company The Mill to produce captivating CG spots for campaigns that span broadcast, online and social.

“The OptiPlex and Latitude product videos we worked on with The Mill were the first of their kind,” says Michelle Gillespie, senior manager, commercial product launch at Dell. “Dell Technologies introduced a new form factor in the OptiPlex line and a new partnership with Google in the Latitude line featuring Chrome OS. Capturing the essence of these new pieces of technology was essential to their introduction into the marketplace. The Mill and Dell Blue did this flawlessly through their CGI executions, bringing the concept boards to life.”

According to executive creative director Joel Davis, for most OptiPlex product launches his team executes productivity-focused campaigns and content geared at Dell OptiPlex’s desk-centric users. OptiPlex 7071 had a different, highly targeted audience in mind: educators and professionals who need a powerful machine to create immersive virtual reality as a presentation and teaching tool. Thus, the campaign called for a more robust messaging strategy.

VR Core is Dell Blue and The Mill’s spellbinding visualisation of what’s virtually possible with the OptiPlex 7071. It depicts the tower bursting open to reveal its intricate, high-tech inner workings hovering in 3D space. The scattered components then reassemble into a series of icons representing the specific industries that the 7071 model was designed for – healthcare, education and retail – before converging back into the tower.

“The VR Core spot was by far the most challenging of the campaign, yet the most creatively rewarding for our team,” says Davis. “We not only had to present the 7071 as an attractive choice but also show our target audience the scale of components that make it powerful enough to run immersive VR experiences. Through a combination of visually sophisticated and conceptual animations, we arrived at a solution that shows exactly what the product is and who it’s for, while bringing it to life within the Dell voice and style of previous campaigns.”

For the OptiPlex 7070, Dell Blue crafted The Unexpected PC, an ad that illustrates the 7070’s billing as the world’s most flexible commercial desktop solution. The scene uses a combination of 3D animation and illustrative motion graphics to outline the product design and modularity that makes the 7070 the perfect choice for professionals who want to create a custom workspace with one device. The 3D simulations were dialled up to introduce the Latitude 5300, the world’s first and most powerful two-in-one Chromebook Enterprise, created in partnership with Google.

Dell Blue commissioned The Mill to execute the VFX portion of the production. “With a simple idea, we knew the visual execution of this would be the key to succeeding creatively,” concludes Davis. “Finding a partner like The Mill to execute highly technical 3D simulations while making them look premium was paramount. Their expertise and artistry really show in this campaign and we enjoyed our collaboration with them.”

“THROUGH A COMBINATION OF VISUALLY SOPHISTICATED AND CONCEPTUAL ANIMATIONS, WE ARRIVED AT A SOLUTION”
Joel Davis, executive creative director, Dell





INDUSTRY INSIGHT

Open-source 3D for illustrators Jumping into professional-level 3D illustration that will cost you… nothing!

Today’s open-source software (OSS) isn’t your Dad’s. Many of today’s top open-source programs have caught up and are every bit as good as the commercial-ware they compete with. Their main cost is the time it takes to learn them, as they can sometimes be a bit more idiosyncratic than commercial tools. But even that’s getting better. Plus, all but one package discussed here is available on all major platforms: Mac, Windows and Linux. As an illustrator, keeping costs down is important. OSS tools don’t have purchase or upgrade costs. They won’t hit you with monthly subscriptions, and their subscription won’t expire in the middle of a major project. Plus, you can load them onto as many computers as you like. And OSS means you don’t have to worry quite as much about software choices. Even if your client’s in-house team works on LightWave, you can still hand them your Blender projects knowing they can easily download Blender and have full access.

AN ILLUSTRATOR’S 3D WORKFLOW

Creating 3D art can be a hugely time-consuming process, so the trick is to use each tool to its best advantage. Use your 2D paint program as much as possible, and limit 3D use to where its use will shine. The first step in understanding 3D illustration is that you rarely need to get perfect renderings out of your 3D software. We all know how seductive the idea of the perfect rendering is, but let it go, trust us. Post-production is your friend, and you can fine-tune your images later in your paint program, often with little effort. You also don’t need to get all of an image rendered in just one pass. Compositing the final illustration in your choice of 2D compositing software (i.e. your paint program) will save you hours and days of wasted time. Yes, breaking all the elements down into separate renderings will add a bit of extra work up-front, but it’s sure to save you time later on when changes and deadlines loom near. In fact, once in your compositor/paint program of choice, you can even have elements coming from many different 3D programs. Why do this? Because while there are many general-purpose 3D applications available, there are also many specialised ones that are designed to let you produce something very specific easier, better or faster than you otherwise could do. More on this in a bit.
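The pass-based compositing workflow above rests on the standard Porter-Duff ‘over’ operator, which every paint program implements for you. A pure-Python sketch of the maths, operating on nested lists of straight-alpha RGBA pixels (0-1 floats), just to make the idea concrete:

```python
# Porter-Duff 'over' for straight-alpha RGBA pixels (all channels 0-1 floats).
def over(fg, bg):
    fr, fg_, fb, fa = fg
    br, bg_, bb, ba = bg
    oa = fa + ba * (1.0 - fa)  # resulting alpha
    if oa == 0.0:
        return (0.0, 0.0, 0.0, 0.0)
    mix = lambda f, b: (f * fa + b * ba * (1.0 - fa)) / oa
    return (mix(fr, br), mix(fg_, bg_), mix(fb, bb), oa)

# Composite each separately rendered element over the background, in order.
def composite_passes(background, *elements):
    result = background
    for elem in elements:
        result = [[over(f, b) for f, b in zip(erow, rrow)]
                  for erow, rrow in zip(elem, result)]
    return result
```

Because each pass is independent, redoing one element means re-rendering only that pass, not the whole frame — which is where the time saving comes from.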

FROM PHOTOSHOP TO BLENDER While not open source or freely distributed, Photoshop is ubiquitous in the graphics world, and beyond. But many Blender users don’t know that you can import a PSD file directly into Blender, if you take a few simple steps first

A scene created by Jonatan Mercado with Blender 2.8 and real-time renderer Eevee

Blender only supports an older version of the Photoshop file format. Rather than looking for a copy of PS 1.0, all you have to do is go into Photoshop’s preferences and open the File Handling section. You will see the Maximize PSD and PSB File Compatibility option, which offers Never, Always and Ask. When a file is saved with maximum compatibility, it is not only much easier to hand off to team members on older versions, but it also becomes importable into Blender. Reason enough to set it to Always! Unlike Maya’s sophisticated PSD import/export with layer support, Blender will import the file flattened. But at least this saves us from creating multiple file copies and bloating our storage.
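If you want to sanity-check a file before handing it over, the fixed 26-byte PSD header is documented in Adobe’s file format specification and is trivial to read. A quick illustrative script (`psd_info` is a hypothetical helper name; the field layout follows the published spec):

```python
import struct

# Read the fixed 26-byte PSD header: 4-byte signature '8BPS', 2-byte version
# (1 = PSD, 2 = PSB), 6 reserved bytes, then channel count, height, width,
# bit depth and colour mode, all big-endian.
def psd_info(path):
    with open(path, "rb") as f:
        sig, version = struct.unpack(">4sH", f.read(6))
        f.read(6)  # reserved bytes, must be zero
        channels, height, width, depth, mode = struct.unpack(">HIIHH", f.read(14))
    if sig != b"8BPS":
        raise ValueError("not a Photoshop file")
    return {"version": version, "channels": channels,
            "width": width, "height": height, "depth": depth, "mode": mode}
```

Note this only reads the header; whether a composite (flattened) image was baked in by Maximize Compatibility lives further into the file.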




THE GORILLA IN THE ROOM

There are few programs that rise above the discussion of commercial vs. open source as well as Blender (blender.org) is able to. It has always been a great option, but in recent years it has become even more capable, rivalling most commercial applications. And with version 2.8’s improvements, its user interface is even more accessible. Blender’s toolset includes the Grease Pencil, which allows you to draw in 3D space, its new Eevee renderer that allows real-time photorealistic rendering, a built-in compositor, and an endless offering of special visual effects tools – one would be hard-pressed to come up with a wish list of anything missing. The only real complaint from an illustrator’s perspective would be that since it is really designed for animators, there is a huge amount there that needs to be ignored, which can make the learning curve steeper than you may want. But this is no different than using Maya or other top 3D programs for illustration. And unfortunately, it’s difficult to recommend a runner-up that comes anywhere close. But there are plenty of free learning tools and videos online, so basic functionality should not take long, and you can add more as you move forward.

© Blender Foundation cloud.blender.org

BLENDER’S NEW GREASE PENCIL & ALL-IN-ONE WORKFLOW Draw 2D in a 3D viewport With the release of more new and refined tools in Blender, like Grease Pencil (enabling artists to draw in the 3D space) and compositing, the door further opens to creating 2D art, storyboards, and other content from within Blender, skipping the need for other tools altogether.

GIMP VS. KRITA

Going open source means forgoing the wonderful world of Adobe. And for many, the hardest part would be the loss of Photoshop, which is certainly hard to replace. But the two programs that may come the closest are the venerable GIMP (gimp.org), and the younger player, Krita (krita.org). GIMP has been around a long time now, and while it has always lagged behind Photoshop, it is a capable alternative. It’s often described as Photoshop, a few versions back. That’s fairly accurate, except for the interface. Its UI continues to drag it down, and takes some getting used to. Want your pressure-sensitive pen to work? You need to dig down three levels and find the settings. Working on a 4K monitor? Some of the UI is still coughing a bit. Nonetheless, the toolsets are very strong. By comparison, Krita has a lovely interface that anyone used to Photoshop will feel instantly at home with. Its toolsets are mostly self-explanatory, with one exception. As you click on various tools you may be looking for the deep settings you expect to find, but most are not there. Krita is a lot less deep than either Photoshop or GIMP. Where Krita shines is in its ease of actually creating art. For example, a simple right-click on the canvas brings up a wonderful painting tool selector, complete with a smattering of other utilities. Oh, and your pressure-sensitive pen will work right out of the box!

SPECIALITY 3D PROGRAMS

There is little in the world of 3D that a program like Blender can’t do. But sometimes there are other programs that can do speciality projects faster, easier or even better. For example, you could create an indoor architectural scene in most 3D software, but the open-source program Sweet Home 3D (sweethome3d.com) is a one-trick pony built for doing just that. Or perhaps you need to create humans for an architectural previsualisation? MakeHuman (makehumancommunity.org) is an open-source Poser-like program for the prototyping of photorealistic humanoids. In both cases, these characters and interiors can be rendered and then brought into your paint program for compositing. Or in some cases, the 3D models can be exported and imported into your main 3D program and rendered there.

Right: Arguably the top two contenders for Photoshop replacement in the open-source arena. Of these two 2D paint programs, GIMP is far more dense. Krita however offers great interface features like its easy-access paint tool controls

With its new Grease Pencil toolset, users can now draw directly into the 3D scene, or in 2D projects, and create amazing new sketch-like imagery © Blender Foundation cloud.blender.org

Left: Need to populate scenes created in Sweet Home 3D? You can download MakeHuman for free and use it as you might Poser or one of the other commercial apps

WRAP UP

While there are alternatives, the heavy-hitters for open-source 3D illustration work are undeniably Blender for 3D, and GIMP or Krita for 2D illustration and for compositing the 3D rendered elements. Let's give a mention to Affinity Photo (affinity.serif.com/en-us/photo) as well: while not free or OSS, it's a very popular, capable and affordable alternative for Mac and Windows.

"ONE WOULD BE HARD-PRESSED TO COME UP WITH A WISH LIST OF ANYTHING MISSING [FROM BLENDER'S TOOLSET]"

The speciality apps mentioned here are a couple of popular, project-specific standouts, but they are hardly the only options, as the list of alternatives is large and always shifting. This means that, just like commercial apps, OSS projects pop up, show promise and sometimes evaporate, so it's always safer to invest your learning-curve hours in programs that have a long-term, established development history.

The last item we have to mention is Ubuntu Studio (ubuntustudio.org). This is a Linux distribution based on the core Ubuntu release, but customised to include most of the software an art and media studio would need to run a daily operation, all preinstalled with the OS. All of it is OSS and free, so you don't even need to spend on macOS or Windows, and it's designed to be especially easy to use. Under the graphics category it automatically installs Blender, GIMP and the vector-based Inkscape.


We explore the latest software and hardware tools to see if they are worth your time or money

AUTHOR PROFILE Cirstyn Bech-Yagher Cirstyn has moved from AMD's Radeon ProRender team to the RizomUV team, where she does product management as well as modelling, UV mapping and tutorial writing. Cirstyn.com

SOFTWARE REVIEW

FEATURES New nodes: Island, Draw, Infinite, Cracks, Bomber, Crater

PRICE Indie: $99 / Professional: $199 / Enterprise: $299

COMPANY QuadSpinner | WEBSITE quadspinner.com

Incredibly accessible UI
Blisteringly fast (complex) terrain generation
Framework for integration into other pipeline applications
New render engine and beta light baking

QuadSpinner Gaea

QuadSpinner's Gaea has matured since we reviewed it last February, with new features and updates released regularly. A Windows 7 or 10 exclusive, CPU- and node-based application, it is aimed at anyone interested in generating terrains, from hobbyist newbies to indies, enterprises and studios. On release, QuadSpinner talked a big game about its dedication to creating tools and technologies that give artists both speed and realism in terrain generation. Does it still walk that talk? Absolutely. To me, in a terrain generation field where Gaea, World Creator and a lagging World Machine are the ones left standing, Gaea offers the most extensive set of features and flexibility by far.

One of the first standouts when using Gaea last year was its inviting UI and its ease of use and creation with its layer- or node-based workflows. This is still the case. The speed of terrain generation and output is impressive, especially now that a memory optimisation option has been added for working with big graphs: you can either clear the cache, or optimise and compact it when you notice Gaea slowing down. Gaea implemented many features and updates last year, like the Island node, allowing users to generate the outline of continents and islands as well as draw it. Together with the rewritten Bomber node and the new Crater node, it's been a snap creating islands and continents with calderas and warlike wear and tear.

Gaea also added some much-needed texturing muscle, which brings me to my second standout: the Texture and Vegetation nodes. Combined with the added splat map as well as the improved Snow node, they make rendering and texturing in or outside of Gaea, such as in Quixel's Mixer, so much easier. Where the Texture node lets you create advanced texture masks for colouring, the Vegetation node complements it with the ability to generate foliage overlays on any of your colour texture maps. The Texture node lets you use colour nodes or external colour inputs like SatMaps. This makes the colouring workflow significantly easier, as it gets rid of having to combine multiple maps for masking, which I appreciate, as it tended to get cluttery in that particular node-ville at times. Combined with the Vegetation node, you can now go to town on your detailing and colouring, as the node lets you add the nuances needed to mimic vegetation.

The third standout for me is not the new infinity graph, the new renderer and its still-in-beta light baking capabilities, nor the inclusion into the Houdini integration club and its subsequent bridge framework. It's the improved Wacom support and the ability to generate terrains by drawing their outlines. Both the Draw and Island nodes have this capability. They look deceptively simplistic, but they can assist in creating highly sophisticated output.

"GAEA IS BOTH FUN AND EFFICIENT TO USE, THE WORKFLOW IS HIGHLY ACCESSIBLE AND BLISTERINGLY FAST"

The ability to outline your terrains like this and then plug them into all the features available makes Gaea both fun and efficient to use, as the workflow is highly accessible and blisteringly fast. I wish I had something negative to say about Gaea, but there isn't much: it would be great if Team QuadSpinner had time for more tutorials, and the Build Swarm 'ding' sounds are somewhat annoying; they drowned out Spotify on complex terrain graphs. I also had some issues with the floating 2D viewport obscuring things, but nothing that significantly cramped my workflow. To be honest, I wish I had more time to spend creating in Gaea. If that doesn't say it all, I don't know what would.

Main: Gaea is fast becoming an industry staple, as illustrated by Aron Kamolz in Clarisse
Left: One of the standouts currently in Gaea is the ability to draw your terrains, allowing you to place and tweak them exactly how and where you need
Below: Creating a large variety of terrains, from mountains and craters to dunes and volcanoes, is impressively easy
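To make the splat-map idea concrete: a texture mask is just a per-cell map derived from terrain properties, which downstream colouring nodes then blend against. Here is a generic, hypothetical Python sketch of a slope-driven mask (this is an illustration of the concept, not Gaea's algorithm or API):

```python
def slope_mask(height, threshold=0.5):
    """Build a binary splat mask from a heightfield.

    Cells whose local slope exceeds `threshold` are marked 1 (e.g. bare
    rock); flatter cells are 0 (e.g. grass). A colouring step would then
    use this map to decide which texture to apply where.
    """
    rows, cols = len(height), len(height[0])
    mask = [[0] * cols for _ in range(rows)]
    for y in range(rows):
        for x in range(cols):
            # Max height difference to the 4-neighbourhood approximates slope.
            neighbours = [height[ny][nx]
                          for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1))
                          if 0 <= ny < rows and 0 <= nx < cols]
            slope = max(abs(height[y][x] - n) for n in neighbours)
            mask[y][x] = 1 if slope > threshold else 0
    return mask
```

Real terrain tools derive far richer masks (flow, deposition, snow line and so on), but they all reduce to this pattern: terrain data in, per-cell weight map out.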

VERDICT


ANOTHER POWERHOUSE

The Erosion Editor is also worth mentioning, as it's a fast and efficient way to finalise your terrains. When you're done with your terrain, save it as a compatible bitmap and send it to the Erosion Editor. The Erosion Tools and their Properties enable you to sculpt pretty much anything with your brush, ranging from soil and rocks to flowlines and erosive breakdowns, allowing for a very natural look and workflow.
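For readers curious what erosion tools like this are doing under the brush, the textbook starting point is "thermal" erosion: where a cell towers over a neighbour by more than a talus threshold, a fraction of the excess slides downhill. A minimal Python sketch of that classic pass (a generic approximation, not how Gaea's Erosion Editor is implemented):

```python
def thermal_erode(height, talus=1.0, rate=0.5, iterations=10):
    """Repeatedly relax a heightfield by sliding material downhill.

    Material moves from a cell to each 4-neighbour that sits more than
    `talus` below it; `rate` controls how much of the excess moves per
    pass. Total material is conserved, so peaks soften into scree slopes.
    """
    rows, cols = len(height), len(height[0])
    h = [list(row) for row in height]
    for _ in range(iterations):
        moves = [[0.0] * cols for _ in range(rows)]
        for y in range(rows):
            for x in range(cols):
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if 0 <= ny < rows and 0 <= nx < cols:
                        diff = h[y][x] - h[ny][nx]
                        if diff > talus:
                            amount = rate * (diff - talus) / 4.0
                            moves[y][x] -= amount
                            moves[ny][nx] += amount
        for y in range(rows):
            for x in range(cols):
                h[y][x] += moves[y][x]
    return h
```

Production tools layer hydraulic flow, sediment and stratification on top of this, but the slide-downhill core is the same idea you steer interactively with the brush.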


Shattered glass in Cinema 4D

PARALYZED
Software: Cinema 4D, Redshift, Marvelous Designer, ZBrush, Photoshop, DAZ Studio
Year made: 2019



Incredible 3D artists take us behind their artwork

Jason Knight
artstation.com/knight
Jason Knight is a freelance digital artist with nearly four decades of experience. He started in 1982 with an award for art created using a Timex Sinclair 1000.

SHATTERED GLASS IN CINEMA 4D

As soon as the idea came to me, I knew that I wanted broken glass. I started by drawing a glass-break pattern in Photoshop. I used Cinema 4D Release 17, but in newer versions Voronoi Fracture also works great. I created a Vectorizer and applied the image as the texture in the Vectorizer properties, then set the tolerance to 1 and the angle to 90 degrees to make the edges sharper. I dropped the Vectorizer into an Extrude object and set an appropriate thickness. Finally, I added modifiers to achieve the right shape and movement.
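Jason paints his break pattern by hand, but the Voronoi Fracture he mentions hints at a procedural route: scatter "impact" points, then let each pixel belong to its nearest point, so the borders between regions trace the cracks. A dependency-free Python sketch of that idea (a generic illustration, not part of Jason's workflow or any Cinema 4D API):

```python
import random

def voronoi_pattern(width, height, num_shards, seed=42):
    """Label each pixel with the index of its nearest seed point.

    Pixels sharing a label form one glass 'shard'; the boundaries
    between differently-labelled regions are the break lines you would
    otherwise paint by hand and feed to a vectorizing step.
    """
    rng = random.Random(seed)
    seeds = [(rng.uniform(0, width), rng.uniform(0, height))
             for _ in range(num_shards)]
    grid = []
    for y in range(height):
        row = []
        for x in range(width):
            # Brute-force nearest-seed test: O(pixels * shards).
            row.append(min(range(num_shards),
                           key=lambda i: (seeds[i][0] - x) ** 2 +
                                         (seeds[i][1] - y) ** 2))
        grid.append(row)
    return grid
```

Rendering the region boundaries as black lines on white gives exactly the kind of bitmap a vectorizer can trace into splines for extrusion; biasing the seed points toward one "impact" spot makes the shards radiate outward more like real glass.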



IN THE VAULT

FREE RESOURCES
Follow the link to download your free files: http://bit.ly/3DW-robots

MODELS AND TEXTURES

EVERY MONTH Download these models and textures to use in your own projects

Future PLC Richmond House, 33 Richmond Hill, Bournemouth, Dorset, BH2 6EZ

Editorial Editor Rob Redman

rob.redman@futurenet.com

Designer Ryan Wells
Production Editor Rachel Terzian
Staff Writer Brad Thorne
Group Editor in Chief Claire Howlett
Senior Art Editor Will Shum
Contributors David Domingo Jiménez, Steve Talkowski, Garreth Jackson, Jarlan Perez, Ian Failes, Trevor Hogg, Rusty Hazelden, Mike Griggs, Mohammadreza Mohseni, Pietro Chiovaro, Antony Ward, Maya Jermy, Oscar Juárez, Tim Doubleday, Eric Jacobus, Lorna Burrows, Lance Evans, Cirstyn Bech-Yagher, Paul Franklin
Creative Bloq Editor Kerrie Hughes

Advertising
Media packs are available on request
Chief Revenue Officer Zack Sullivan
UK Commercial Sales Director Clare Dove
Advertising Sales Manager Mike Pyatt
Account Sales Director Matt Bailey

International Licensing
3D World is available for licensing. Contact the Licensing team to discuss partnership opportunities.
Head of Print Licensing Rachel Shaw Licensing@futurenet.com

Subscriptions
Email enquiries contact@myfavouritemagazines.co.uk
UK orderline & enquiries 0344 848 2852
Overseas order line and enquiries +44 (0) 344 848 2852
Online orders & enquiries www.myfavouritemagazines.co.uk
Group Marketing Director, Magazines & Memberships Sharon Todd

Circulation
Head of Newstrade Tim Mathers

Production
Head of Production Mark Constance
Production Project Manager Clare Scott
Advertising Production Manager Joanne Crosby
Digital Editions Controller Jason Hudson
Production Manager Nola Cokely

Management
Managing Director - Prosumer Keith Walker
Chief Operating Officer Aaron Asadi
Commercial Finance Director Dan Jotcham
Global Content Director Paul Newman
Head of Art & Design Greg Whitaker

MODELS + TEXTURES

Printed by William Gibbons & Sons Ltd, 26 Planetary Road, Willenhall, West Midlands, WV13 3XB

FREE ASSET DOWNLOADS
Download a set of fantastic textures and build your own asset library with these incredible models in our monthly model and texture giveaway

Distributed by Marketforce, 5 Churchill Place, Canary Wharf, London, E14 5HU www.marketforce.co.uk Tel: 0203 787 9001 ISSN 1470-4382

We are committed to only using magazine paper which is derived from responsibly managed, certified forestry and chlorine-free manufacture. The paper in this magazine was sourced and produced from sustainable managed forests, conforming to strict environmental and socioeconomic standards. The manufacturing paper mill holds full FSC (Forest Stewardship Council) certification and accreditation.

All contents © 2019 Future Publishing Limited or published under licence. All rights reserved. No part of this magazine may be used, stored, transmitted or reproduced in any way without the prior written permission of the publisher. Future Publishing Limited (company number 2008885) is registered in England and Wales. Registered office: Quay House, The Ambury, Bath BA1 1UA. All information contained in this publication is for information only and is, as far as we are aware, correct at the time of going to press. Future cannot accept any responsibility for errors or inaccuracies in such information. You are advised to contact manufacturers and retailers directly with regard to the price of products/services referred to in this publication. Apps and websites mentioned in this publication are not under our control. We are not responsible for their contents or any other changes or updates to them. This magazine is fully independent and not affiliated in any way with the companies mentioned herein.

ASSETS + SCREEN GRABS + VIDEO


RENDER LIKE PIXAR
Part 3 in our tutorial series from Rusty Hazelden on harnessing the power of RenderMan

UNREAL ENGINE 4
Follow along with Oscar Juárez's video on creating PBR materials in Unreal Engine 4

ISSUE 259

NEXT MONTH
Masterchef CG: Create stunning food-based still-life renders

If you submit material to us, you warrant that you own the material and/ or have the necessary rights/permissions to supply the material and you automatically grant Future and its licensees a licence to publish your submission in whole or in part in any/all issues and/or editions of publications, in any format published worldwide and on associated websites, social media channels and associated products. Any material you submit is sent at your own risk and, although every care is taken, neither Future nor its employees, agents, subcontractors or licensees shall be liable for loss or damage. We assume all unsolicited material is for publication unless otherwise stated, and reserve the right to edit, amend, adapt all submissions.

Future plc is a public company quoted on the London Stock Exchange (symbol: FUTR) www.futureplc.com

Chief Executive Zillah Byng-Thorne
Non-executive Chairman Richard Huntingford
Chief Financial Officer Penny Ladkin-Brand
Tel +44 (0)1225 442 244

ON SALE 24TH MARCH

Subscribe today: www.bit.ly/3dworld-subs

We are committed to only using magazine paper which is derived from responsibly managed, certified forestry and chlorine-free manufacture. The paper in this magazine was sourced and produced from sustainable managed forests, conforming to strict environmental and socioeconomic standards. The manufacturing paper mill and printer hold full FSC and PEFC certification and accreditation.

