Designing Multi-Disciplinary User Experiences That Take Users Off the Screen


Creative Cloud

Soaring 883 feet above street level, the One Liberty Observation Deck offers 360-degree views of Philadelphia’s Center City and far beyond. After a quick, narrated elevator ride to the 57th floor, visitors encounter a super-sized Benjamin Franklin sculpture, Philadelphia-focused listening stations and multi-lingual touch screens that enable them to zoom in on a single point hundreds of feet below.

Credit: Photo by M. Fischetti for VISIT PHILADELPHIA

Next time you’re in Philadelphia, look up—way up—to see the One Liberty Observation Deck. The deck, located some 883 feet above ground, features a 360-degree interactive panorama of the entire city. With six touchscreens and nearly 100 interactive points of interest, the digital installation helps visitors get up close and personal with the city through stories, stunning gigapixel-class photography and unique observations.

It is just one of many interactive user experiences developed in collaboration with Stimulant, a San Francisco-based company that creates place-based digital interactions. This particular experience also involved the deck’s operators Montparnasse56 and photography partner Panogs.

“What we try to do is bring social experiences back into the physical world using technology as a conduit to connect people with one another in the same space in the same time,” said Nathan Moody, Design Director at Stimulant.

It is the merging of architecture, user interface design and experience design that sits at the intersection of reality and possibility. These types of experiences explore the possibility of making physical properties interactive and, in some cases, almost human. It is a whole new way to view the user experience, but it is still very much rooted in user experience design.

“Make no mistake, everyone who works here cut their teeth on the web,” Moody said. “What’s different is bridging the gaps between disciplines and deliverables.”

Moving experiences off the screen in an increasingly digital and connected world is inevitable. Stimulant’s clients range from museums and iconic national landmarks to cutting-edge and innovative companies, each seeking to create more meaningful and interactive experiences for their customers. As more people become aware of the potential to make physical spaces as rich as virtual spaces in terms of context and interaction, physical spaces will become just as much an actor in an experience as the very people who are experiencing it.

The Disappearing Computer

Exploration into the realm of smart physical spaces and user experiences isn’t new. The Disappearing Computer Initiative was a European government-funded program that launched in 2001 and was active for just a few years. Its goal was to see how incorporating information technology into everyday objects and settings would support and enhance people’s lives.

The work was based largely on three objectives that are perhaps more relevant to user experience designers today than ever before:

  • Create “information artefacts” by embedding technology into everyday objects
  • Explore how “collections” of these “artefacts” can work together to produce “new behavior and new functionality”
  • Ensure people’s experiences with “these new environments is coherent and engaging”

These early projects mimicked much of what we’ve seen developed in recent years. One project looked at embedding location-based technology into grocery store items like cereal boxes to help people navigate stores and find groceries faster. Another explored using the physics of sound control to enable appliances to communicate with humans. And this was just the beginning.

No Rules, But More Expectations

Fifteen years and countless technological advances later, designers are still in the early stages of exploring what is possible when the physical and digital worlds collide.

“There are no best practices about doing any of this yet. Our responsibility is to educate our partners and get them excited about the possibilities versus the challenges,” Moody said.

Unlike virtual user experiences, building interactive user experiences in the physical world requires bringing outside elements into the design experience. Moody is constantly educating partners and clients around interactivity as a building material, the properties of certain construction materials, natural and traditional lighting sources, and how the physical and virtual worlds can work together to make an idea come to life.

Despite this, clients these days are often well informed and their expectations and aspirations have grown exponentially. They ask about Bluetooth low energy beacons, employ people internally to stay abreast of what’s cutting-edge, and bring examples of interactive experiences they’ve found on YouTube to help communicate their visions. Just a few years ago, these same clients were referencing Minority Report.

“They no longer need the tropes of science fiction to view what’s possible,” Moody said.

Challenges In Designing Multi-Disciplinary User Experiences

It’s not just the companies behind these experiences that are asking for more; users want more too.

“There’s been a 180-degree shift in terms of user expectation over the past eight years,” Moody said. “When we first started, you had to make it explicit for users to touch things. Touch experiences became commoditized overnight. Now everyone is going to be creating multi-touch experiences.”

Users want—and expect—experiences beyond touch, and companies are keen to deliver.
However, there is still an overall problem of awareness. Creating these types of experiences requires development and quality assurance testing that can get very complex. Various levels of augmented reality are often layered on top of an experience, making these experiences massive and expensive to develop. On top of this, the different technological components need to work together to sense when users are present or engaged in order to deliver the intended experience, something Moody calls “the socialization of technology.”

Since many of these large-scale activations occur in public or touristy places, privacy, accessibility and language barriers also pose challenges. At the One World Observatory in New York City, Stimulant addressed one of these problems by incorporating data into a visitor’s ticket. When a ticket is scanned upon entry, the visitor is welcomed to the experience in their native language and given insight into their fellow visitors, including the places around the world they have travelled from.

“Digital experiences in physical spaces are made or broken by the context that surrounds them,” Moody said.

This can range from ensuring a user understands or is familiar with the experience’s goal, to whether or not a user’s fashion choices affect the experience. Fake fingernails, for example, do not conduct electricity, a crucial requirement for touch-based experiences.

Simple Does It

These experiences are turning places into spaces and changing the way we think about what constitutes an experience, all while helping brands and artists alike tell stories that allow users to create meaningful memories. At the end of the day though, no matter how complex the development or installation, these are still user experiences designed to bring people together. Slowly, they are changing the way people interact with each other, technology and the physical world around them.

“Give people a platform to share stories with one another and these emerging behaviors come forward,” Moody said. “Usually it’s the simplest of experiences that allows people to do that.”

How to Support Design Decisions Through Iterative Testing


Creative Cloud

A while ago I was tasked with designing new features for a client’s data-rich web app. The requirements came from business analysts who, together with the product owner, would talk to the client’s product manager and discuss the features that needed to be built.

This is the workflow found in many service-based companies: lacking direct access to end users, the business analysts and product managers become the de facto experts on customer needs. The immediate consequence is that their personal views become the assumed views of the customer, as there is no data to support a different perspective. Decisions are therefore made based on what business analysts think the customer wants, rather than what customers actually need.

This is one outcome of the shift towards an Agile culture that sets the focus on continuously building new features, rather than testing and researching which ones are actually worth the development effort. The opportunity cost of not investing time in research to determine what features are actually needed is much higher than a couple of days’ work to prepare low-fi concepts.

In an Agile environment, the UX designer has to constantly deliver solutions for the current iteration while thinking of the remaining items in the backlog, and preparing the concepts for the next sprint.

The Pitfalls of a Continuous Design Process

The continuous design process is a double-edged sword. On one hand it provides constant challenges to designers and keeps them focused on solving design problems, but on the other it doesn’t allow time for feature validation through proper research and testing.

The need to be on schedule and deliver at the beginning of the sprint forces designers to cut corners and to rely on gut feeling and personal experience instead of valid research data. And after a feature is released, it is assumed that the client will use it, allowing the team to jump into the next item in the backlog.

In this scenario, we could go on about the lack of support for user research, but the problem is created by Agile development practices that promote feature factories. Believing that validation will come after a release is a flawed expectation as it assumes that if you build a feature, customers will use it.

In many cases, even if a client requests a feature, users will simply ignore it if its functionality slows down their productivity.

Unless designers coordinate across all teams to keep users at the center of their design and development efforts, they will most likely release features that will end up re-worked or dropped.

Unsuccessful features impose an even greater cost than building them in the first place.

The code written to offer that functionality adds to the complexity of an app, and with each feature the cost of carrying this extra “weight” is higher.

With each new feature released, it becomes harder to modify and test an application.

Support Design Decisions Through User Research

To avoid unnecessary complexity, we must support design decisions based on both qualitative and quantitative research. All the more so if your team adheres to an Agile design and development practice. In this instance, you need to adapt your methods to suit the lack of time and resources that classic user research demands.

There is a wide range of user research methods available, and the Nielsen Norman Group has done a great job in classifying them by 3 dimensions:

  1. Attitudinal vs. Behavioral
  2. Qualitative vs. Quantitative
  3. By Context of Use

Qualitative research generates data about user behaviors and attitudes, based on observing them directly. This usually happens in usability and field studies where the researcher directly observes how individuals use a specific product.

Quantitative research collects data indirectly through methods like surveys and analytics. Quantitative research captures large amounts of data that can be analyzed to show the scale at which certain issues appear.
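As a toy illustration of this kind of analysis (the issue labels and data below are invented for the example), tallying survey responses with a few lines of Python shows the scale at which each reported problem appears:

```python
# Illustrative sketch: counting how often each usability issue shows up
# in survey responses, to gauge the scale of each problem.
from collections import Counter

# Hypothetical survey data; a real study would load this from a file or API.
responses = [
    {"issue": "search results unclear"},
    {"issue": "export too slow"},
    {"issue": "search results unclear"},
]

issue_counts = Counter(r["issue"] for r in responses)

# Report the most frequent issues first.
for issue, count in issue_counts.most_common():
    print(f"{issue}: {count}")
```

Even this minimal tally makes it clear which problem affects the most respondents, which is exactly the prioritization signal qualitative sessions alone cannot provide.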

While quantitative research focuses more on the behavioral dimension, the qualitative approach tries to understand people’s attitudes and mental models when using a product.

Choosing which method to use, and when, is difficult due to the adaptive and evolutionary nature of Agile methods. Depending on the phase of the development process, whether it’s execution or assessment, a lean approach that uses concepts to test assumptions is recommended.

By relying on a build-measure-learn feedback loop, the team can quickly prototype a new feature, measure how clients respond, and learn whether they need to make improvements or completely re-work the concept.

The RITE Method

One of my preferred methods to test concepts is the RITE method (Rapid Iterative Testing and Evaluation), an iterative testing method in which changes to the UI are made as soon as the usability test reveals a problem for which a solution can be proposed.

Usually this happens after observing the behavior of the first participant and the team decides whether improvements to the concept need to be made before the next round of testing.

After that second round, the team continues to run tests and modify the prototype iteratively, until the desired result is achieved and the feature is ready for development.
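The loop described above can be sketched schematically. This is an illustrative sketch only: `run_session` and `apply_fix` are hypothetical stand-ins for real research activities (observing a participant, revising the prototype), not part of the RITE method itself.

```python
# Schematic sketch of a RITE-style loop: fix issues as soon as a session
# reveals them, then test again, until no issues remain or time runs out.

def rite_test(prototype, run_session, apply_fix, max_rounds=5):
    """Return the refined prototype and the number of rounds used."""
    for round_number in range(1, max_rounds + 1):
        issues = run_session(prototype)   # observe one participant
        if not issues:
            # Desired result achieved; the feature is ready for development.
            return prototype, round_number
        for issue in issues:
            # Improve the concept before the next round of testing.
            prototype = apply_fix(prototype, issue)
    return prototype, max_rounds          # deadline reached
```

Note the `max_rounds` cap: it is one simple way to address the open-ended-deadline problem discussed below, trading completeness for a predictable schedule.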

The disadvantages in using this method in a pure form in an Agile context are:

  • the time necessary to adjust the prototype between testing sessions can affect the development schedule.
  • there is no clear deadline for when testing will be finished, as that depends on the types of issues found in each round of testing.

Solving User Testing Time and Scope Issues

In an iterative process, low fidelity design tasks replace the traditional design phase. By using low-fidelity wireframes in the testing sessions, much of the functionality in the concept will be incomplete.

The risk is that during the first test, the client might point out those incomplete parts as issues and disregard the areas that the researcher needs validation on.

The best way to test with low-fidelity wireframes is to keep the focus on specific sections and test them in an isolated context.

By focusing on smaller components, or even just parts of a bigger feature that will be developed in several sprints, the total number of potential issues is minimized.

Conclusion

In just a couple of days of user research, designers can identify the biggest usability problems in their concept. This activity protects the business from investing time and resources in features that distract from and damage its long-term goals.

This adapted RITE method allows for testing and reiterating to be completed in just a couple of days, empowering designers to conduct tests in every sprint. By running tests on small components every two or three weeks, UX designers can align themselves to the development schedule and push design improvements faster.

July Update of Adobe Experience Design CC (Preview)


Creative Cloud

We are excited to announce the July update for Adobe Experience Design CC (Preview). With each release, we’re constantly adding new features and improving the existing ones, based on your feedback. In this July update we’ve focused on bringing you the highly requested Zoom tool, improvements to export and enhancements to defining prototype flows.

To try out these new features, download the latest version of Adobe Experience Design or update it via the Help menu. (Help > Check for Updates)

What’s New in this Update?

Zoom Tool

As a designer, having the ability to zoom in and out of your artwork is critical, especially while working on details. In this update, we’ve introduced a new Zoom Tool located in the left toolbar. You can click and drag to create a rectangular marquee selection to zoom in on a specific part of the artboard.

Zoom Tool

Marquee select to zoom

If you are working with a trackpad you can pinch to zoom. If you are working with a mouse, you can use ALT+Mouse Scroll to zoom. You can also access the Zoom tool temporarily while panning (via SPACE) by adding the CMD modifier. This is especially useful for quick navigation, and XD switches back to your active tool once you are done zooming.

Zoom Tool Shortcuts & Tips

  • Zoom Tool: Z
  • Temporary Zoom: SPACE + ⌘
  • Zoom In: ⌘ +
  • Zoom Out: ⌘ –
  • Zoom To Fit: ⌘ 0
  • 100%: ⌘ 1
  • 200%: ⌘ 2
  • Zoom in one step: Single-click
  • Zoom out one step: OPT/ALT + Single-click
Zoom Tool

Zoom Tool in action

Go To Previous Artboard

Prior to this update, there was no way to create a true “back” button in your prototype that would return a user to the last artboard viewed. With this update, we have introduced an additional property in the Target dropdown: “Previous Artboard.”

Using this new option, you can easily create a link back to the previous artboard without having to manually locate and draw a wire to it. Simply head over to Prototype mode and select “Previous Artboard” from the Target dropdown. When returning to the last visited artboard, the previous animation automatically plays in reverse, with a duration matching the original transition. This is another great example of how important your feedback is to our team, helping us define the workflows you need.

Previous Artboard

Select – Previous Artboard from the Target dropdown.

Previous artboard

Using Previous Artboard you can quickly create a back button.

Export PDF

With the July update, we are expanding the ways in which you can export artboards and assets. In addition to PNG and SVG export, XD now supports PDF export.

You can now export all your artboards or selected assets as a single PDF document or as multiple PDF documents. While exporting your artboards, pick PDF from the Format menu in the export dialog. This new addition to the export options should enable you to easily share, present and review your designs with stakeholders. We look forward to your feedback on the export options so we can further enhance and improve them.

PDF Export

Export options now support PDF in addition to SVG and PNG.

Copy & Paste from Sketch and other tools

Now bringing in content or copy from other applications is easier. Use the copy function (Cmd+C) to copy content from apps like Sketch, PowerPoint, InDesign — or any other tool that puts PDF onto the clipboard. A simple Paste command (Cmd+V) places it into your XD file. The pasted content will be editable in XD.

We’d love feedback on this feature if you find that your content does not import correctly into XD!

What’s coming soon?

Linear Gradients

We know how important gradients are and the team has been busy working on a cool new gradient editor. We couldn’t resist, so here is a sneak peek of what’s in store.

Linear Gradients

Sneak peek – Linear Gradients.

Korean Support

A while back, the XD team visited design agencies in Korea. They spent time understanding the kinds of challenges and workflows these agencies deal with while working on screen design projects. As a result of that effort, we are thrilled to add support for Korean in the next release.

Be a part of this Journey

Feedback

A big shout-out to everyone who downloaded XD and sent us feedback. Your continued support in the form of feedback is invaluable to the team. We constantly strive to build a better product that designers will love. Visit our User Voice feedback site to suggest features, upvote ideas or report issues, and join our Forum to participate in discussions with the community.

Social

You can follow our handle @AdobeXD for updates or reach the team on Twitter using the #AdobeXD hashtag. You can also talk to us on Facebook, where we answer questions during live sessions and share videos and other updates.

#MadeWithAdobeXD

While sharing your prototypes on Behance, don’t forget to tag them with #MadeWithAdobeXD and select Adobe Experience Design under “Tools Used” for the opportunity to be featured in the Adobe XD Newsletter. Check out Sabine Schwarzkopf’s Tralley app project on Behance, designed using Adobe XD.

Events & Articles

The Adobe XD team visited UXcamp Europe #UXCE16 in Berlin on June 25/26, where attendees tested Adobe Experience Design CC. See what they had to say about the workshops:

Video – Adobe XD at UXcamp Europe 2016: Parth Nilawar

Video – Adobe XD at UXcamp Europe 2016: Joachim Tillessen

Reading – What if there were a formula to create innovative products, experiences or services? One of our product managers, Demian Borba, shares a manual for innovation. http://adobe.ly/298L3d2


Please download the latest version of Adobe Experience Design and let us know what you think. Looking forward to hearing your thoughts.

Carolina Niño’s Technicolor Dreams


Creative Cloud

Carolina Niño is a Colombian-born artist who creates vivid, geometric collages inspired by natural elements. She recently created a Photoshop Tutorial on how to create a fantastical collage inspired by images from Adobe Stock. We spoke with Carolina about her journey to finding her signature style and the importance of creative space.

Carolina has always been surrounded by art and nature. She comes from a family of artists, and growing up in the Colombian countryside, nature became Carolina’s first and lasting source of inspiration. Even as a city-dwelling adult today, she frequently travels to reconnect with the elements.

After completing her education in graphic design, Carolina began her career at an agency but felt stifled by the controlled environment. Craving a change, she moved to Spain, where she lived with like-minded artists and musicians. In this new, creative atmosphere, Carolina’s artistic sensibilities began to blossom. It was a time fueled by curiosity, a period of learning. “At that time I was just absorbing, I didn’t have any style at all,” she admits.

sample 5

The concept of a creative atmosphere also resonates in Carolina’s art. In the same way that people need space to grow and thrive, so do the elements of a collage. As Carolina explains, it’s impossible to focus on a single aspect of a crowded visual, and so she places utmost importance on the perfect balance of colors, shapes, and space. In a way, “the space between one object and another is more important than the object itself,” she says.

sample 4

After her transformative years in Europe, she returned to South America with a renewed creative perspective. She picked up her first computer and started experimenting with Photoshop. Mixing and layering images came naturally to her, and she began to master the perfect balance between colors, shapes, and space.

Around the same time, she discovered Behance. “I wanted to be like those artists on Behance,” she shares, but she found the task of catching up to these established artists to be a daunting challenge. Rather than trying to emulate their success, Carolina shifted her focus to her craft, learning from tutorials and bringing together her experiences to cultivate her signature style.

sample 2

And her strategy worked – Carolina’s work has been featured in both the Illustration and Graphic Design galleries on Behance, and her projects have combined views of over 72,000.

Carolina’s art is an extension of her experiences in life. Each piece comes together to create a complete, balanced image. “Life is like a big collage. You travel and you put pieces together to create something.”

Follow Carolina’s step-by-step tutorial to create your own digital collage with images from Adobe Stock.

Check out more of Carolina’s fantastical work on her Behance portfolio.

After Effects CC 2015.3 In-Depth: UI Changes and Other Improvements

Creative Cloud

Earlier this summer, we released After Effects CC 2015.3 (13.8), which features a variety of performance and workflow enhancements.

In this article, we’re sharing more detail about changes to the user interface, bug fixes, and the many other small improvements.

Please, if you want to ask questions about these new and changed features, come on over to the After Effects user-to-user forum. That’s the best place for questions. Questions left in comments on a blog post are much harder to work with; the blog comment system just isn’t set up for conversations. If you’d like to submit feature requests or bug reports, you can do so here.

scroll tabbed panels using the Hand tool

When a panel group has more panel tabs than you can see at once, you can now scroll the tabs by panning with the Hand tool. This helps you navigate between panels faster, especially in the Timeline panel when your project has many compositions open at the same time.

Scrolling with the Hand tool works similarly to panning the view in the Composition or Timeline panels: activate the Hand tool, then click and drag left or right in the tab well of a panel group to scroll the view of those tabs.

To activate the Hand tool, do any of the following:

  • Select the Hand tool in the toolbar.
  • Hold down the spacebar key or the H key to temporarily activate the Hand tool when a different tool is selected.
  • Click and hold the middle mouse button (often the scroll wheel on a mouse or the first button on a tablet pen) when the mouse pointer is over the tab well.

Tabbed panels will only scroll when there are more tabs than are visible for the width of the panel group. Stacked panel tabs do not scroll.

You can also scroll tabs by holding the Shift key and scrolling with a mouse scroll wheel or scroll gestures on touch input devices. Without holding the Shift key, scrolling tabs with the mouse wheel or gestures continues the existing behavior of moving from tab to tab.

additional native format support

  • Apple ProRes QuickTime files can be decoded natively on Windows, without needing QuickTime installed on the system.
  • RED camera raw (.r3d) file decoding has been updated with the newest RED SDK. New functionality includes support for the RED Scarlet-W, Raven, and Weapon cameras, including 8K .r3d footage.

other changes

  • Double-clicking on an active panel’s tab, or in the fallow area of the tab well of a panel group, maximizes or restores that panel group, the same result as pressing the ` (accent grave) key with the mouse pointer over that panel group.
  • Previews with Cache Before Playback enabled no longer automatically start playback if After Effects is in the background when caching completes. The After Effects icon will bounce in the dock (Mac OS) or flash in the task bar (Windows) to indicate that caching is complete.
  • Snapping now allows 2D layers to snap to 2D layers inside of a nested composition with the Collapse Transformations switch enabled.
  • A new workspace named Libraries has been added to make it easier to view, organize, and import assets from your Creative Cloud libraries.

notable bug fixes

  • Previews no longer exhibit banding in 16bpc and 32bpc projects on Windows with Nvidia display cards when the Hardware Accelerate Composition, Layer, and Footage Panels option is enabled.
  • Dragging with the Unified Camera Tool in the Composition panel now refreshes the view faster.
  • Region of Interest now displays the contents inside the region as you draw.
  • 3D Camera Tracker no longer reframes the layer when Region of Interest is enabled.
  • Viewing the alpha channel or straight alpha in the Composition or Layer panel again displays the alpha channel as black, instead of the composition background color, when the Hardware Accelerate Composition, Layer, and Footage Panels option is enabled.
  • ARRIRAW files can again be imported on Mac OS.
  • Adobe Stock still image preview assets are now automatically replaced with the full-resolution, non-watermarked version when you purchase a license.
  • After Effects again automatically recognizes changes to the dimensions of a .psd file.
  • Animation presets included with After Effects that include a mask no longer fail to reference the mask path correctly.
  • Auto-save no longer saves the project during a preview. If the timing of auto-save would have occurred during a preview, the project is saved when the preview is stopped.
  • Auto-save no longer saves the project when After Effects is run from the command line.
  • Scrolling tabs in a panel group with the mouse wheel or other gestures now scrolls in the gestured direction, instead of the inverse.
  • Scrolling in the expression editor in the Timeline panel no longer exits the expression editor when the scroll reaches the top or bottom.
  • Panning the Graph Editor view with the hand tool works again, including with the middle mouse button.
  • Resetting the Time Remap property no longer produces an error message, “internal verification error, sorry! {asked for invalid component of property path}”.
  • Clicking with the eyedropper on a solid in the Project panel, on either the solid’s icon or the thumbnail at the top of the panel, now returns the expected color.
  • Pausing the render queue no longer sometimes causes the current render item to stop and the next item to start.
  • Migrate Previous Version Settings no longer fails when you save the open project before quitting.

What It Means To Be A Designer’s Developer

Creative Cloud

As an Experience Developer at Adobe Design, our central design organization, I play a unique role during product development. Sure, I perform all the standard engineering duties: I implement features, I review pull requests, I fix bugs. But what differentiates my job is that I develop in the service of design and user experience. I see it as my responsibility to do what’s necessary from an engineering standpoint to best support the design team throughout the development process. I describe myself as a ‘designer’s developer.’

What is a Designer’s Developer?

As a ‘designer’s developer’ I wear many hats. In addition to developing, I can be a negotiator, detective, design advocate, and co-conspirator during any given project. I help communicate the design team’s intentions, serve as their voice in engineering meetings, help push their ideas forward and generally serve as a bridge between design and engineering. So what exactly does that look like? It means a lot of listening, a lot of building, and often a lot of throwing code away.

Experiment, Explore, and Experience Through Prototyping

In the early stages of a project I spend the better part of my time collaborating closely with designers, prototyping various interaction models. Sometimes I build just a couple; sometimes I build as many as 10 or 20. In my mind, during this phase of development my main responsibility is to say ‘yes’ to any and all design requests. Any idea the designers want to test out, I’ll build for them with (almost) no argument. In the beginning designers have a wealth of ideas and even more questions. They need to experiment, explore and experience as much as they can to help answer those questions and discover what works and what doesn’t. Prototyping is my way of contributing to this discovery process. The more I prototype, the more the design team learns and the better the end product will be.

Build with a Focus on Detail

When designs become more finalized and engineering begins in earnest I spend more and more time on the nuts and bolts of development: implementing features, sprint planning, fixing bugs etc. I work mainly on front-end features and augment the development team, helping out however I can. I also focus on details; the nuances and smaller moments that go into user experience such as animations, transitions, look and feel, and general polish.  Because of time and resource constraints these design details can all too often fall through the cracks. It’s one of my main responsibilities to fill those gaps and ensure the design team’s priorities are met.

Push Back to Push Forward

Being a ‘designer’s developer’ doesn’t mean always implementing everything design wants. Sometimes the designs aren’t technically feasible or are too invasive and destabilizing to the build. It’s never fun pushing back but designers know they can trust me. They know we have the same priorities, that I want the best possible end user experience and will do whatever I can to help realize it.

Embrace Your Interests to Find Your Niche

At the end of the day I’m a developer. I love building applications, devising unique solutions to problems, and sleuthing out the causes of nasty bugs. I respect and strive for the rigor and skill it takes to build a solid and robust application. But at heart I’m a ‘designer’s developer.’ Through supporting the design team I have the opportunity to use code as a tool for exploring a wide array of fresh and original ideas. It’s a real joy and thrill to quickly prototype an idea and see it come to life, only to start on a completely different concept the next day. Designers constantly push the envelope and challenge me to push it with them. I also embrace the complex task of bridging the divide that exists, for better or worse, between design and development. Most rewardingly, when all is said and done, I play a key role in creating something useful, enjoyable, and beautiful for people.

It took me a little while to find my niche as a software engineer. I first had to realize not all engineers are cut from the same cloth. Just because I wasn’t terribly excited about encoding algorithms didn’t mean I didn’t have a place in the industry. Next I had to discover what sort of engineering problems I was most interested in and passionate about solving. I had always been drawn to user experience but not until I started working closely with designers and, eventually, within design organizations did I find where I belong.

After Effects CC 2015.3 In Depth: Maxon CINEWARE and Cinema 4D Exporter Updates

Creative Cloud

Earlier this summer, we released After Effects CC 2015.3 (13.8), which features a variety of performance and workflow enhancements.

In this article, we’re sharing more detail about the updates to the Maxon CINEWARE 3.1 plug-in, which includes the Maxon Cinema 4D Exporter.

If you want to ask questions about these new and changed features, please come on over to the After Effects user-to-user forum. That’s the best place for questions; the blog comment system just isn’t set up for conversations, so questions left in comments on a blog post are much harder to work with. If you’d like to submit feature requests or bug reports, you can do so here.

Maxon CINEWARE 3.1 plug-in

Maxon CINEWARE 3.1.14 includes the following features:

  • improved Cinema 4D Exporter, including support for animated 3D Text and Shape layers
  • enhanced OpenGL rendering
  • bug fixes

Cinema 4D Exporter: animated 3D text and shape layers

The Cinema 4D Exporter (File > Export > MAXON Cinema 4D Exporter) now exports 3D text and shape layers into the .c4d file.

3D shape layers are exported as extruded spline objects, including animation of shape layer properties.

When you export a composition that contains 3D text layers, you will be prompted to choose between exporting extruded spline objects or extruded text objects:

  • Extrude Text as Shapes
  • Preserve Editable Text

The Extrude Text as Shapes option exports 3D text layers as extruded spline objects in the .c4d file. This option retains the fidelity of the layer: character and paragraph formatting, and animation of text layer properties. The font and text content cannot be modified in Cinema 4D.

The Preserve Editable Text option exports 3D text layers as extruded text objects in the .c4d file. The font and text content can be modified in Cinema 4D. This option has limited support for character and paragraph formatting, and animation of text layer properties. Text animation features that are not supported include: text animators, kerning, tracking, vertical text, paragraph text, and text on path.

Strokes are exported for 3D text and shape layers into the .c4d file. Note that while the Ray-traced 3D renderer in After Effects does not render strokes for 3D text layers, strokes will still be exported when enabled. To view 3D text layer strokes before exporting, make sure that the composition renderer is set to Classic 3D.

Other Cinema 4D Exporter improvements

To improve the workflow between After Effects and Cinema 4D, the Exporter now shifts the scene coordinates for the parent null object so that the center of an After Effects composition matches Cinema 4D’s center at 0,0,0.
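
The idea behind that recentering can be sketched in a few lines of Python. Note that this is an illustration only: the helper name is made up, and the actual exporter's transform (including how it handles axis orientation) is not published; the Y-axis flip shown here is an assumption based on After Effects using a top-left origin with Y increasing downward, while Cinema 4D's Y axis points up.

```python
def ae_to_c4d_position(x, y, z, comp_width, comp_height):
    """Shift an After Effects layer position so that the comp's
    center lands at Cinema 4D's origin (0, 0, 0).

    Hypothetical sketch: AE positions are measured from the comp's
    top-left corner with Y increasing downward, so we subtract the
    half-dimensions and flip Y. Not the exporter's actual code."""
    return (x - comp_width / 2.0,
            comp_height / 2.0 - y,
            z)
```

With a 1920x1080 comp, a layer sitting at the comp center (960, 540, 0) would map to the Cinema 4D origin, which is what makes added AE cameras and extracted scene data line up.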

Also, when you import a .c4d file that was created by the CINEWARE 3.1 version of the Exporter and add it to a composition, you can view the scene through an After Effects camera by first adding a camera and then setting the Camera setting in the CINEWARE effect to Centered Comp Camera. Any After Effects 3D layers you add to the composition will now line up with the Cinema 4D scene layer. Extracted 3D scene data from the .c4d file, such as nulls, cameras, and lights, will also line up, provided that any new objects added to the .c4d file are grouped under the same parent null object created in the exported .c4d file.

The default name for the exported .c4d file is now the name of the composition, instead of “untitled.c4d”.

Exported .c4d files are saved as version 17.0.

Enhanced OpenGL rendering

The OpenGL renderer in CINEWARE now supports the same level of Enhanced OpenGL quality as Cinema 4D for these properties:

  • Transparency
  • Shadows
  • Post effects
  • Noises

Photoshop Masterclass with Aaron Nace, live on Twitch.tv/Adobe


Creative Cloud

Tuesday July 19, Wednesday July 20, and Thursday July 21 from 9AM to 12PM (Pacific Time).

Join Photoshop Master Aaron Nace from PHLearn for his first live-stream tutorials, exclusively on the Adobe Twitch channel! Aaron has taught photography and photo manipulation to millions of people around the world, at every skill level, and is committed to sharing his methods of creative conceptual photography both online and in person.

PHLearn just launched Photoshop 101-301, their longest, most comprehensive, and most fun Photoshop tutorial ever. Follow along live on Twitch and interact with Aaron as he takes you through parts of his new tutorials and helps you learn Photoshop.

July 19: Photoshop 101

Start your Photoshop journey by covering the basics that will help you build the best habits and prepare you to be more proficient from the get-go. Aaron will get you set up with a custom workspace to best suit your needs. You will also learn about color profiles and how to get the most out of your images.

July 20: Photoshop 201

Photoshop offers many options when it comes to adjusting your layers. Aaron will take you step-by-step through the most useful adjustments. What’s more, you will learn how to make Photoshop work for you, saving you time to create more awesome images.

July 21: Photoshop 301

Photoshop provides the tools to edit much more than just photos and images. On day 3, Aaron will share his more advanced Photoshop techniques and tools: working with smart objects, typography, and 3D.

After Aaron’s Master Class, the day continues with an exclusive Street Photography Workshop by Erick Kim and a Video Project (Animated Album Art) with DJ Cutman.

For the full schedule check the Adobe Twitch Channel.


Join us on Twitch.tv/Adobe on July 19, 20, and 21!

Diving Into the Essential Sound Panel in Audition CC


Creative Cloud

A year ago, Adobe Premiere Pro released a feature called the Lumetri Color Panel, which distilled the incredible color correction tools of the rather complicated Adobe SpeedGrade application into a very approachable set of tools accessible to almost anyone. Expert-level color manipulation and calibration workflows had previously been a foreign language to many editors, but the intuitive tools and friendly labels of the Lumetri panel helped all of us become more familiar with, and more embracing of, consistent, dynamic color in our productions.

In chatting with customers, those of us on the Audition team kept hearing a similar call to simplify audio production. Editors working on tight deadlines with shrinking budgets didn’t always have the luxury of sending a project off to an audio professional for mixing, and often when they tried to do it themselves, they got lost or didn’t know how to achieve the results they wanted. With that in mind, we created the Essential Sound Panel, which is available in the latest release of Adobe Audition CC.

The Essential Sound Panel (or ESP, as one of our pre-release users cleverly named it in awe upon first trying it out) guides editors through standard mixing tasks for Dialogue, Music, Sound Effects, and Ambient or Environmental audio content. DME (Dialogue, Music, Effects) is a common audio mixing and deliverable methodology that ensures each element of a production’s sound is mixed and has appropriate effects for its role in the project. In the ESP, selecting one or more clips and assigning a Mix Type – the role those clips play in the production – exposes several effects appropriate for that type. Each group of effects and controls is placed in the natural order an audio expert would follow, such as first ensuring all content is at a standard, consistent level. For dialogue recordings, the next step might be to clean up background noise or electrical hums before adding compression and EQ. For music, the next step might be to stretch the clip or leverage Adobe Audition Remix to recompose it to match the duration of your video project. Let’s dive in and take a closer look at what each Mix Type offers and how to customize the panel for your own organization or project.

The Essential Sound Panel offers mix type-specific operations for Dialogue, Music, Sound Effects, and Ambience recordings

Dialogue

Dialogue clips may consist of recordings captured during a performance, ADR recorded in a studio or off-set at a later time, or voiceover and narration.  While each might play a different role in your production, the techniques applied to bring out clarity and improve definition are very similar.  The initial goal is to get all content at the same starting point in terms of loudness, get rid of any anomalies, bring out the best elements of the captured recordings, then add that bit of creative spark that helps it exist within the visual environment.  Finally, there may be some slight adjustments to particular elements to emphasize them above the others.

The first step is to Unify Loudness, and Audition accomplishes this by adhering to the ITU loudness standards. Loudness differs from volume in that it’s a measure of average volume consistency over time, with very specific definitions to ensure one piece of content plays at the same level as a piece created by another organization; it is the centerpiece of recent broadcast regulation around the world. When you select a clip and assign it a mix type, Audition automatically analyzes it and calculates its original loudness value. Clicking the Auto-Match button applies a non-destructive clip gain adjustment so that all Dialogue clips play at the same loudness. While you can enable Auto-Match with one or dozens of clips selected, each clip is analyzed and adjusted separately to ensure consistency.
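
Under the hood, loudness matching amounts to measuring each clip and applying a single compensating clip gain. Here is a minimal Python sketch of that idea, using plain RMS as a simplified stand-in for the ITU-R BS.1770 measurement Audition actually uses (the real standard adds K-weighting filters and level gating; the function names here are made up for illustration):

```python
import numpy as np

def clip_loudness_db(samples):
    """Rough loudness estimate: RMS level in dBFS. A simplified
    stand-in for ITU-R BS.1770, which adds K-weighting and gating."""
    rms = np.sqrt(np.mean(np.square(samples)))
    return 20.0 * np.log10(max(rms, 1e-10))

def auto_match(samples, target_db=-23.0):
    """Return a non-destructive gain (in dB) that brings the clip
    to the target loudness, plus the gain-adjusted samples."""
    gain_db = target_db - clip_loudness_db(samples)
    return gain_db, samples * 10.0 ** (gain_db / 20.0)
```

Because the gain is a single scalar per clip, the adjustment is trivially non-destructive: the original samples are untouched, and removing the gain restores the clip exactly.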

Auto-Match ensures all clips are at the same initial volume. Clips in the timeline will reflect the adjustment and the mix type assigned with a matching icon badge.

Next, you may need to do some light restoration, such as removing noise, microphone rumble, or the harsh “s” sounds (aka sibilance) that may have been picked up during recording. Select the Repair Sound group header to expose these parameters. Simply grab a slider and move it to enable each effect and adjust the amount applied. If you’ve already enabled an effect and wish to hear your clips with it on or off, click the checkbox next to the effect name.

Reduce background noise, microphone rumble, electrical hum, or harsh sibilance using the deceptively simple sliders.

While the Essential Sound panel abstracts these complex effects and tools as simple sliders, they are actually mapped to Audition’s powerful, native effects, controlling several parameters at once to ensure great results without getting lost among the plethora of parameters many of our effects offer.  If you open the Effects Rack and select Clip Effects at the top, you can see exactly which effects the ESP is adding or removing as you toggle the checkboxes on and off.  And if you’d like to see precisely what the ESP is doing, you can open the effect UI and watch it adjust as you change the slider.

The simple slider is actually controlling many effect parameters, non-destructively, under the hood.

Once you’ve cleaned up the sound, it’s time to make it pop! This is typically achieved through the use of Compression and Equalization. Equalization, or EQ, is the emphasis or reduction of certain frequencies that can help audio content fit well with other material or make it more pleasant to our ears. Compression is the subtle adjustment of loudness, often limited to certain frequency ranges and with specific rates of modification, that can make a recording feel more professional and easier to listen to. The ESP relies upon Audition’s quite incredible Dynamics Processing effect, which allows for compression (reducing the dynamic range of a recording) and expansion (increasing the dynamic range – essentially pushing the loudest and quietest bits of a recording even further apart). When you enable Dynamics in the ESP for a clip, Audition analyzes the clip to determine the optimal RMS threshold values and assigns these to specific points on the adjustment envelope. Still with me? All that really means is that we take the guesswork out of finding the right settings for each individual clip and concentrate what is possibly our most complicated effect down to a simple slider.
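
To make the compression idea concrete, here is a deliberately minimal Python sketch of a static compressor: levels above a threshold are reduced by a ratio. Audition’s Dynamics Processing effect does far more (attack/release smoothing, a multi-segment envelope, and the per-clip analysis described above); this only shows the core gain-reduction math, and the names are illustrative.

```python
import numpy as np

def compress(samples, threshold_db=-18.0, ratio=3.0):
    """Static compressor sketch: any level above the threshold is
    reduced by the given ratio (3:1 means 3 dB over the threshold
    comes out as 1 dB over). No attack/release smoothing."""
    level_db = 20.0 * np.log10(np.maximum(np.abs(samples), 1e-10))
    over_db = np.maximum(level_db - threshold_db, 0.0)
    gain_db = -over_db * (1.0 - 1.0 / ratio)  # gain reduction above threshold
    return samples * 10.0 ** (gain_db / 20.0)
```

For example, a full-scale sample (0 dBFS) is 18 dB over a -18 dB threshold; at 3:1 it is reduced by 12 dB, while samples already below the threshold pass through untouched.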

After analyzing each clip, every instance of the Dynamics Processing effect is adjusted for the unique values of each recording.

Adjusting the EQ enhances the frequency curve of your clips, which can make subtle improvements – emphasizing the bits and pieces the human ear has evolved to hear best – or make dialogue sound as if it exists in a different environment. Select from our simple and fun EQ presets and adjust the slider to control how much your sound is modified.

Audition's Essential Sound Panel ships with several presets to help you achieve the sound you want.

This leads naturally into adding a little creative Reverb to your recordings, giving them the feeling that they are actually happening in the room, church, alley, or other environment that may be in your video. Audition again offers several presets for different locations, which you can adjust to match the mood and setting of your project.

Dx Reverb Presets

If you find a combination of settings that you’ll want to go back to often in a project, or maybe just want to try a few pre-built configurations to start exploring, each Mix Type offers a selection of ESP Presets at the top. You can select and apply any of these to your selected clips quite easily, so John’s recording can always sound like John while Bernadette’s clips always sound like Bernadette. There are some I’d specifically like to call out, as they are very handy for easily changing subtle characteristics in recordings so they better match the apparent distance to the actor on-screen. This can be especially handy when recording ADR, where each take is up close and consistent, but the video footage has a mix of close-up, medium, and distant shots. Make Close Up, Make Distant, and Make Medium Shot apply a subtle Reverb and EQ curve so that a single recording can quickly give the impression of any of those distances from the camera.

ESP Mix Type presets automatically configure the parameters and effects, making consistent sound even easier to achieve!

Music

When you assign the Music mix type to a clip in your timeline, you’ll have fewer options than with a Dialogue clip. Most music comes premixed and mastered, so the most common task is making sure it fits the length of your project. The Essential Sound Panel offers two modes to help achieve this goal.

Stretch your music tracks to fit them to your scene, either speeding them up or slowing them down, or use Audition's Remix functionality to automatically cut and compose a new version of the song!

When you open the Duration group, you’ll see an option for which method to use to adjust the duration. Stretch applies a real-time stretch or compression to your song, which modifies the tempo. By default, we perform a real-time stretch so that playback can be heard immediately. If you stretch by more than a few percent, you may wish to enable Rendered Stretch, which takes a bit of time to perform but usually sounds significantly better. You can toggle this by opening the Properties panel, opening the Stretch group, and selecting Rendered. The other method relies upon Audition’s Remix technology, which was released in the fall of 2015 and has been incredibly popular. Remix analyzes a song – any song file – and creates a web of potential edit and crossover points, then shuffles the segments of your song to create a new composition that closely matches your target duration. While the full effect in the Properties panel offers a half-dozen additional parameters to fine-tune the mix, the Essential Sound Panel lets you select whether you want Remix to favor shorter segments – and therefore more transitions – when recomposing your music track. (Of course, you can always open the Properties panel and make additional adjustments to the Remix effect!)
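
The "more than a few percent" rule of thumb is easy to quantify. This small Python sketch (the function names are made up; Audition's stretch engine also preserves pitch, which plain arithmetic ignores) computes the stretch ratio needed to fit a song to a scene and the resulting tempo change:

```python
def stretch_ratio(song_duration_s, target_duration_s):
    """Ratio by which a song must be stretched to fill a scene,
    plus the resulting tempo change in percent (positive means
    the song speeds up). Illustrative math only."""
    ratio = target_duration_s / song_duration_s
    tempo_change_pct = (1.0 / ratio - 1.0) * 100.0
    return ratio, tempo_change_pct
```

Fitting a 3:00 song into a 2:51 scene, for instance, needs a ratio of 0.95 – roughly a 5% tempo increase, which is exactly the territory where switching to Rendered Stretch (or letting Remix re-edit the song instead) tends to sound better.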

Sound Effects (aka SFX)

Sound effects, and the next mix type, Ambience, are closely related and are often grouped together when mix engineers provide final audio for a film or broadcast production.  When we were planning the Essential Sound panel, however, we decided to break them apart as we “see” them quite differently.  We felt that SFX were generally shorter bits that correspond to some action or event on-screen, while Ambience has more to do with the overall environment and location and may be more subtle than other sounds.

When clips are assigned the SFX mix type, options are exposed to add reverb – with environmentally applicable presets, as opposed to those available in the Dialogue display – and sounds can easily be panned around the stereo field to correspond with their position on-screen. For instance, if a door opens on the left side of the video frame, I can add Door Opening Sound 01.wav to my timeline at the right position, assign the SFX mix type, and adjust the Pan slider to the left to quickly place the sound so that it comes from the left speaker. (There is also an SFX preset, From the Left, which makes this even easier!)
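
Panning itself is simple math. Here is a sketch of the textbook constant-power pan law in Python – a common technique that keeps perceived loudness roughly steady as a sound moves across the field; it is not necessarily Audition's exact pan law, and the function name is made up:

```python
import math

def constant_power_pan(sample, pan):
    """Place a mono sample in the stereo field.
    pan runs from -1.0 (hard left) to +1.0 (hard right); the
    sine/cosine weights keep left**2 + right**2 constant, so
    perceived power stays steady as the sound moves."""
    angle = (pan + 1.0) * math.pi / 4.0  # 0 (left) .. pi/2 (right)
    return sample * math.cos(angle), sample * math.sin(angle)
```

At center (pan = 0.0) both channels get about 0.707 of the signal rather than 0.5, which is what prevents the "hole in the middle" a naive linear pan would create.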

Add reverb or adjust a sound effect’s position in the stereo field to correspond to its location on-screen

Ambience

Often referred to as Room Tone or Environmental sounds, these are recordings that give a sense of presence in a scene but don’t necessarily correspond to events or objects on-screen. Think of a scene set in an outdoor Parisian cafe, where we might hear the murmur of other conversations, the sound of delicate coffee cups clinking, and perhaps the far-off grumble of the occasional Vespa scooter in the background. In an office environment, we may hear the air conditioner or the subtle buzz of terrible overhead fluorescent lamps. In addition to the common Loudness and Reverb parameters you’ve seen with the other mix types, the Ambience group offers the ability to adjust the Stereo Width of your clips. This allows you to expand or decrease the “depth” or immersion of those background recordings, making them feel as if they are actually occurring all around the listener, or are very focused, even claustrophobic. In the examples above, the cafe ambience would probably be very wide and happening all around us, while the unnatural hum of those lamps might be focused and drilling directly into the viewer’s brain to emphasize the horrible environment. (Are you picking up my general loathing of these lights? Good.) For additional control, you can open the Stereo Expander effect that is inserted into the Effects Rack and adjust where that sound might be coming from.

The Ambience mix type allows you to adjust room tone characteristics of a recording, such as the reverb or stereo immersion.

But what ELSE can the Essential Sound Panel do?

You might have noticed that, several times, I showed a native Audition audio effect plugin being modified by the ESP sliders. It’s true! The ESP relies upon the award-winning DSP and patented algorithms in each of our audio effects. If an editor builds a preliminary audio mix but the project is handed off to a professional audio engineer, that engineer can simply close the Essential Sound panel and adjust the native effects as if they had built the project themselves! But what’s really enticing for that audio professional – the one who insists that no video editor should ever touch the audio because only THEY know the right settings to achieve the sound they want – is that they can completely customize almost every one of these parameters with their own personal settings and presets! Behold the Master Template view!

The Master Template view enables an audio professional to completely customize the parameters and presets for their team.

Adjusting the Master Template allows an audio expert to set the configurations or presets for almost every effect and parameter available in the ESP. These can be used to create project- or organization-specific presets and defaults, easily shared with every member of a team, further ensuring consistent results between editors. A year ago, there was a bit of buzz as the top audio engineer at National Public Radio (NPR) shared some details about their signature sound. While it always helps to start off with microphones that cost several thousand dollars apiece, they augment those great recordings and skilled presenters with very strict EQ and compression profiles. A journalist’s job is to tell stories and share the news, not to be an audio engineer. With a bit of work setting these presets up once, they can be distributed quite easily to every desktop with Audition installed, and customized for specific projects – even specific actors or talent – with the results sounding consistent no matter who is assigned the task of editing and mixing. I’ll cover Master Template editing and distribution in a later post, but please leave a comment if this is something you’d like to learn more about before then.

Alright!  How do I get started?

Download the latest version of Adobe Audition CC! The Essential Sound panel is available in the Window menu and is docked by default in several workspaces, including the new Essential Video Mixing workspace. Open an existing session, send a sequence from Premiere Pro via Dynamic Link, or start recording a brand new one. Select a clip and assign it a mix type in the Essential Sound panel, then get started mixing!