Customer Intelligence and the Automated Digital Storefront



As I watched the news streaming in from F8, the Facebook Developer Conference, one announcement stood out to me above all the others: the introduction of chatbots in Facebook Messenger (essentially, programs that use natural language processing to understand human conversation online and to respond or take action accordingly).

The Messenger platform now boasts 900 million Monthly Active Users, and will be open to businesses as a new channel for automated commerce, customer service interactions, and more. Chatbots represent a new “automated” digital storefront; from ordering a pizza, to scheduling a test drive, to getting help with a software product, the goal for marketers is to allow consumers to have conversations with brands anywhere in the world, from the comfort of their own fingers. Additionally, the emergence of chatbots on Messenger (and elsewhere) promises the ability to offer a personalized, consistent brand experience at scale. What a time to be alive, right?

The reason that interest in chatbots is exploding is that natural language processing (NLP) technology is getting good enough to make “automated conversation” viable. I got my first in-depth taste of NLP last year when I worked with the research team at Adobe to develop DataTone, an application demoed at the Sneaks session at Adobe Summit 2015 that takes analysis questions (e.g. “How much revenue did the spring campaign generate for women’s shoes?”) and turns them into Adobe Analytics queries, then returns an answer to the user (“The spring campaign drove $24,491 for the women’s shoes category.”). Imagine that same ease of use, but for everything else in life!

That’s far from all, however; in fact, the dramatic shift in customer experience might not even be the biggest opportunity for marketers resulting from this exciting new technology. Looking at it from the perspective of customer data, given the widespread use of messaging apps, what I see in the emergence of chatbots is a tremendous boon nearly on par with the emergence of social media last decade.

Why Isn’t This Just Social Data 2.0?

Messaging data will be fundamentally different from both social data and survey data in a few key ways that are important to understand, since you may have social data and survey data flowing into your customer intelligence efforts already. Chatbots certainly won’t replace social media or surveys in your analysis, but will rather enrich the picture of your customers to which these sources are already contributing.


Let’s start with social data. Unlike social media, chatbots represent a direct brand interaction. I’m not posting a picture of myself in the restaurant; I’m talking to the restaurant directly. As such, the way I talk—the things I reveal about myself, either explicitly or implicitly—will be very different. Another important difference between social data and messaging data is that social media allows me to broadcast an opinion to my network, whereas messaging is one-on-one (or perhaps a-few-on-one). Again, this will impact the signal that I, as a consumer, provide to a brand. A customer might try using snark to tweet-shame his or her ISP for having slower-than-advertised download speeds, but when that same customer is one-on-one with an agent (human or otherwise), the tone of the conversation might be calmer and more conciliatory—or, of course, it might be even angrier.

Now, let’s talk about surveys. Surveys are typically served to a cross-section of a population (e.g., your customer base), and rely on the customer to remember a previous interaction once it is complete (“How did you like the product you bought?”). Chatbots, on the other hand, allow a brand to scale infinitely (for all intents and purposes) to have real-time engagement with a customer or prospect. No sampling required—chatbots can engage with every single person who wants to talk to a brand for any reason. Additionally, engagement with a chatbot indicates a signal of interest—in buying a product, receiving help from customer support, etc.—whereas surveys are typically randomly served, often to customers or prospects whose intent at the time may or may not be related to the survey topic. Lastly, the ability to get feedback and insight in real time, as the interaction is occurring, will change the content and tone of the feedback. A customer may express dislike of a certain option in a live messaging context, but not even remember disliking the option when he or she receives a survey two weeks later.

The New ABCs of Customer Intelligence

Over the past 20 years, marketing has gotten really good at understanding customer behavior across channels. Knowing what your customers did—whether online, in store, on the phone, etc.—is, of course, key to understanding your customer. However, it is only part of the puzzle. Last year, we added the Customer Attributes capability to Adobe Analytics, which allows you to merge attribute data from a CRM or other customer data source with behavioral data. Knowing demographics, psychographics, lifetime value, propensity to purchase, etc. tells you a lot about who your customers are and how to market to them—especially when combined with behavioral data.


Now let’s consider chatbots. These are brand-specific interactions. You’re not tweeting about the brand. You’re not walking through a crowded mall. You’re literally talking to the brand about what you, as a customer, want or need. Unlike on-site chat, there’s no friction; the engagement happens within a brand-agnostic app where users spend a ton of time already. There’s no waiting for someone to respond to your tweet; it’s nearly instantaneous. And, unlike in-store or call center touchpoints, it’s already digital. With each message, your customer is giving you direct, decipherable signal about who they are.

As it did with analytics in DataTone, natural language processing in a chatbot takes what can be an overwhelming task (“Uggghhhh, I can’t figure out how to add more toppings to my pizza on this coupon”) and makes it as simple as sending a text message or talking to Siri (who is, herself, an NLP application). In a messaging context with a brand, a simple chatbot conversation might look like this:

Brand: Hi, how can I help you today?
Me: I’m taking a trip to Hawaii and I need snorkeling gear for my family.
Brand: Sounds fun! We’ve got a handful of great options for kids and adults. How heavy duty do you want to go?
Me: It’s our first time in Hawaii; I’ve never really snorkeled before. What do you recommend for novices?
Brand: Hmmm. We’ve got three snorkels for surface-level exploration. Do any of these look good?
Me: I think my wife would like these blue ones a lot. How are the reviews on them?
Brand: Very positive. An average of 4.6 out of 5, with 438 reviews. Would you like to buy them?
Me: Sure, these seem great.
Brand: Awesome! Click over to our web site where I’ve placed this product in your cart already.

I’m not actually going to Hawaii—I only wish I were. But look how much signal about me is contained in those few short responses! In a direct conversation with this brand, I made it clear that I have a family. We’re going on vacation to Hawaii. I’m fairly new at sea exploration. I’m reasonably articulate (thank you, thank you). I care about product reviews. My wife likes the color blue. Imagine adding all of that quantitative and qualitative insight to my customer profile, overlaid on top of the attributes you might already have for me and any behaviors I might have performed on any of your other digital or non-digital channels.

It remains to be seen exactly how data from chatbot interactions will be made available through Facebook and other platforms. One of the big concerns, of course, will be privacy; chatbot conversations are meant to be private messages like any message you would send through these platforms. However, there is likely a happy medium to be found, where brands can capture tremendous marketing value from NLP data, and where consumers can maintain trust both in the brands themselves and in the platforms through which they are interacting.

The ABC combination of Attributes + Behavior + Chat is going to make one amazing triumvirate for customer intelligence.
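To make the “ABC” idea concrete, here is a minimal Python sketch of what folding chat-derived signals into an existing customer profile might look like. The keyword rules, field names, and profile structure are hypothetical simplifications; a real chatbot would rely on a proper NLP pipeline and on your own attribute and behavioral schemas.

```python
# Hypothetical sketch: enriching a customer profile with signals mined from a chat transcript.
# The keyword rules, field names, and profile structure are illustrative only.

chat_transcript = [
    "I'm taking a trip to Hawaii and I need snorkeling gear for my family.",
    "It's our first time in Hawaii; I've never really snorkeled before.",
    "I think my wife would like these blue ones a lot. How are the reviews on them?",
]

# Toy "NLP": map keywords to profile signals. A real chatbot would use intent
# classification and entity extraction instead of simple string matching.
KEYWORD_SIGNALS = {
    "family": {"household": "family"},
    "wife": {"marital_status": "married"},
    "hawaii": {"travel_intent": "Hawaii"},
    "never really snorkeled": {"experience_level": "novice"},
    "reviews": {"values_reviews": True},
    "blue": {"color_preference": "blue"},
}

def extract_chat_signals(messages):
    """Collect the profile signals implied by a list of chat messages."""
    signals = {}
    for message in messages:
        lowered = message.lower()
        for keyword, attrs in KEYWORD_SIGNALS.items():
            if keyword in lowered:
                signals.update(attrs)
    return signals

# Attributes (e.g., from a CRM) + Behavior (e.g., clickstream) + Chat = the "ABC" profile.
profile = {
    "attributes": {"loyalty_tier": "gold", "region": "US-West"},
    "behavior": {"visits_last_30d": 6, "last_purchase": "swim fins"},
    "chat": extract_chat_signals(chat_transcript),
}

print(profile["chat"])
# e.g. {'household': 'family', 'travel_intent': 'Hawaii', 'experience_level': 'novice', ...}
```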

Bring On the Bots!

In the quest for scalable customer intelligence, the development and productization of reliable NLP technology—the automated digital storefront—stands to become a huge boon to marketers, potentially changing the quality and depth of customer understanding forever. While we don’t yet know how, when, or where the various platforms, including Facebook Messenger, will make messaging data fully available to marketers, the combination of attribute data, behavioral data, and chat data stands to enhance marketers’ views of their customers in ways we’re only beginning to imagine. So I say bring on the bots! “Hello, Delta Airlines chatbot. I think I’ll have that trip to Hawaii now. . .”


The 2nd-Party Data Continuum: “Data Sharing — It’s So Hot Right Now”



Second-party data is so hot right now. Various publications have called 2016 the year 2nd-party data goes mainstream, and as a DMP (data-management platform) practitioner, I have to agree with that assessment. But, as we will explore in this post, not all 2nd-party data is created equal.

Read: Is 2016 the year marketers embrace data sharing?

As marketers look to reach new audiences that are similar to their best customers, they have to rely on external datasets to find those audiences. And, as they look outward, they have to make a choice between scale and access. Do I use data that is easily accessible but also available to my competition? Or, do I put in the effort to find exclusive datasets that provide inherent advantages?

That choice between scale and access is not binary. The 2nd-party data ecosystem is a continuum, with the characteristics of 3rd-party data at one end and 1st-party at the other:

[Image: the 2nd-party data continuum]

Before we explore the rise, and subsequent stratification, of 2nd-party data, let’s review why the trend is picking up steam now.

3rd-Party Data Is Everywhere

Buying 3rd-party data is a proven way to reach new customers. It aggregates behavioral and intent signals in a way that no marketer could on their own. But, there is also a lot of it. Every DMP or DSP (demand-side platform) worth its salt offers seamless access to 3rd-party data, typically from dozens or even hundreds of data sources. The concern is no longer access but choice. How do I decide which provider to work with?

Making this decision more difficult is the fact that many 3rd-party providers pull their data from the same sources. Take “in-market” data, for example. The vast majority of this data comes from pixeling highly trafficked websites to collect behavioral signals. A person searching for “Bermuda” is in market for a cruise. A person reading an article about the Tesla Model 3 launch is in market for an electric car (and “green energy” and “healthy living” and “social causes,” and a number of other marketable data traits).

If you’re curious about where all of this in-market, or behavioral, data is coming from, download the Ghostery browser plug-in.


For many large publishers, you’ll see between 30 and 50 tags on the site, with many of these beacons designed to collect data signals for advertising.

[Image: Ghostery scan of a large publisher site]

That’s 44 trackers collecting the same data and offering it in different packages. It’s the same data! Doesn’t anybody notice this? I feel like I’m taking crazy pills (sticking with the whole Zoolander theme)!

Assuming you can source the right 3rd-party dataset and use it to power your advertising, you are then confronted with the reality that your competition has the same access and may be bidding against you for the same inventory.

The Rise of 2nd-Party Data

The desire for more accuracy, transparency, and competitive differentiation fuels the 2nd-party data economy. The easiest way to think about 2nd-party data: it’s somebody else’s first-party data.

An airline knows who its rewards customers are. It knows who its business travelers are. For a credit-card company, this is very valuable information; it would love to market its “travel rewards” cards to known business travelers. If the airline and the credit-card company are partners, sharing this information makes a ton of sense. This is 2nd-party data sharing — the original, pure form of 2nd-party data.
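As a rough illustration of the mechanics behind such a swap (independent of any particular DMP), here is a hypothetical Python sketch in which the two partners match audiences on hashed email addresses rather than exchanging raw customer lists. The lists, field names, and hashing choice are assumptions made for the example, not a description of how any specific platform implements data sharing.

```python
import hashlib

def hashed_ids(emails):
    """Hash identifiers so partners can match audiences without exchanging raw PII."""
    return {hashlib.sha256(e.strip().lower().encode()).hexdigest() for e in emails}

# Hypothetical 1st-party lists held separately by each partner.
airline_business_travelers = ["pat@example.com", "lee@example.com", "kim@example.com"]
card_prospects = ["lee@example.com", "kim@example.com", "ray@example.com"]

airline_segment = hashed_ids(airline_business_travelers)
card_segment = hashed_ids(card_prospects)

# The overlap is the 2nd-party audience the credit-card company can target with
# its travel-rewards offer, under the terms of the partnership agreement.
shared_audience = airline_segment & card_segment
print(f"{len(shared_audience)} matched business travelers")  # -> 2 matched business travelers
```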

Beyond this simple swap (Advertiser A and Advertiser B share data), there is a spectrum of use cases where two entities make their owned data available to one another:

  • Publisher 1 and Advertiser A share data
  • Publisher 1 shares its data with Advertiser A (i.e., Advertiser A spends a lot of money with Publisher 1 and negotiates data for an upfront buy)
  • Advertiser A sells its data to Advertiser B
  • Publisher 1 sells its data to Advertiser A

The key new concept here is monetization, and it is typically driven by a publisher looking for a way to monetize its data assets. That distinction is critical, because we now see 2nd-party data packaged and sold like 3rd-party data.

Selling 2nd-party data in this way will certainly make it more available but will also drive down the value as it becomes more and more commoditized (and less about relationships).

A publisher may allow a data aggregator to pixel their site and collect data. An advertiser can then buy that data from the data seller. But, what if the advertiser already has a relationship with the publisher? Can they access that data directly without having to buy it through an intermediary?

The 2nd-Party Data Continuum

If we begin to break down the 2nd-party data section of the previous chart, we must compare availability and exclusivity and understand whether the intent of the data share is monetization or relationship-building:

[Image: the 2nd-party data continuum, broken down by availability, exclusivity, and intent]

2nd-Party Data at Scale

Marketplaces that enable the sale of 2nd-party data do so in a way that is modeled off of the 3rd-party data business: low friction, high discoverability, with the ability to transact online.

Data sellers profit from an open marketplace but at the risk of devaluing their data as access, control, and scarcity are sacrificed for scale.

Private Marketplace

To preserve control and establish a tighter relationship with potential 2nd-party partners, some data marketplaces offer the ability for sellers to privately transact. In this workflow, the listing is typically unbranded (e.g., “in-market travel data from top domestic airline”), and the data provider has the ability to approve buyers before their identities are known.

One to One (1:1) Relationships

At the furthest end of the 2nd-party continuum is a 1:1 data exchange between two parties. This can be facilitated with technology, such as a DMP, but is typically based on a separate partnership agreement, and the data is not listed in any marketplace for sale.

This is certainly not a scalable model, but the benefit is that the exchange is based on a trusted business relationship, and the data being acquired is scarcer and usually a better complement to the marketer’s own 1st-party data.

I’m Evaluating Ad Tech — Why Does This Matter to Me?

When looking at the ad-tech landscape — and specifically, the DMP industry — it’s clear that the major providers are gravitating toward certain points on this data continuum. Some DMP vendors are focusing their solutions around the reach and access to 3rd-party data. Others look to link 2nd-party data across their clients to make partner data widely accessible.

The Adobe Audience Manager platform is built on the philosophy that a client’s 1st-party data is its most valuable asset, and sharing that data should be done with trust and discretion top of mind. 2nd- and 3rd-party data should be accessible to those who want it — made available by those who choose to share it — but monetization and distribution should take a backseat to governance and control.

The selection of your technology vendor should be dictated by how well its approach aligns with your data-management needs and data-ownership philosophy.


Mobile at the Center of Innovation — Mobile World Congress Trends: Part One


At the end of February, I was in Barcelona, Spain, for the 2016 Mobile World Congress (MWC). The theme was “The Edge of Innovation.” With 101,000 attendees, the MWC brought together industry leaders to talk about the future of mobile, connectivity, virtual reality, and more. It was also Adobe’s re-entry into the MWC, as I was able to announce the release of Adobe Experience Manager Mobile (AEM Mobile) — Adobe’s newest solution for building and managing beautiful apps and delivering engaging, data-driven experiences.

Since the conference, a few things have really stuck with me. In this post, I will cover the exciting trajectory of connected experiences, the Internet of Things (IoT), and mobile apps. In part two, I will highlight virtual reality and changes in the way we interact with our devices.

The Future of Connectivity

The future looks more connected than ever before, as once-siloed industries are starting to merge. Partnerships are forming that will influence how we all relate to each other and communicate through our devices. Innovation and creativity are driving us toward a future where smartphones read gestures, where devices allow you to control both your television and your refrigerator, and where connectivity reigns — even for tasks as monotonous as fueling or charging your vehicle.

According to the analyst firm Gartner, 5.5 million new things will be connected every single day in 2016. IoT and connected experiences are becoming a daily reality. As users quickly assimilate, the companies creating the future are embracing the potential.

At the center of the innovations — revealed and imagined — is mobile. Mobile is what will keep us all connected and what will connect us to the things we use every day. In the near future, not having a smartphone in the developed world may not be an option. Many telecommunication companies know this, which is why 5G and extended LTE considerations were significant parts of MWC 2016.

The Future of IoT

The future of IoT revolves around mobile.

IoT is already utilized in manufacturing and technology fields and is extending to everyday consumer use. We can now receive data about our home-energy use on our phones, track fitness through both our watches and our shoes, and remotely see what is in the fridge — all thanks to the integration of mobile and connected devices.

IoT is being utilized to connect and track data, while mobile is the platform on which we receive information and control our connected devices — resulting in the entire connected experience.

Just over 40 percent of households in developed countries have at least one connected device. Connected TVs have almost reached 30 percent market adoption, with connected printers and cameras following, and demand for connected cars expected to increase in the coming years.

Unsurprisingly, the prediction is that connected-device sales will exceed smartphone sales in less than two years. That means connected devices will outdo the smartphone, the most successful consumer device to date, by surpassing its user base of 2.2 billion and its yearly sales of over 1 billion units, including upgrades.

To make IoT successful, there are a few things industry leaders must work on, including security. We are not far away from having every connectable aspect of our lives at our fingertips — 24/7. The proliferation of mobile devices and mobile app use is spurring this possibility forward.

Connected experiences and IoT, and their deep connection to mobile, are just two of the exciting trends that have stuck with me since MWC 2016. Find out more about the Mobile World Congress, including trends in virtual reality and mobile interaction, in part two.


Facebook Instant Articles and Adobe Analytics



When Facebook and Google announced their intentions to create faster mobile experiences, we at Adobe applauded the move. When our publisher customers heard about Facebook’s plans, many of them opted to join early-adopter programs. Creating engaging content is what defines publishing, and the faster an article loads, the more users will engage with it. For publishers, engagement drives revenue, and any opportunity to increase the interest of an audience is an opportunity worth exploring.

These days, consumers have little patience; a recent study commissioned by Adobe showed that if digital content takes too long to load, 78 percent of people switch or stop altogether. The difference between capturing a reader and losing them is determined in milliseconds. At the same time, publishers want to understand how their readers engage with and consume their content, so they can continue to improve the reading experience and deliver more personalized content across any screen and platform. That is our bread and butter.

Today, we are happy to announce that we’ve worked with Facebook to support Instant Articles in Adobe Analytics. Instructions for putting page tags into articles can be found in our documentation. We also announced support for Google’s Accelerated Mobile Pages (AMP) at their launch, and we have collaborated closely with Google. The best news is that in both cases, publishers have the option to reference the existing JavaScript tagging framework already in use, which simplifies the implementation process and avoids building analytics support from scratch.

Once Instant Articles are tagged, all of the rich information you expect from Adobe Analytics is available. For example, publishers can measure content velocity (articles viewed after this content), time spent per page and per visit, and the percentage of first-time visitors. And because the Instant Articles framework leverages existing visitor identification methods, publishers can build all of the meaningful behavior-based segments they rely on, such as visitors who view more than five pages, visitors who spend more than five minutes with the content, or first-time visitors who return within five hours.
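To give a flavor of what those behavior-based segment definitions amount to, here is a small, hypothetical Python sketch that classifies visitors from summary records. The record format and thresholds mirror the examples above but are otherwise invented; in practice, you would define these segments directly in Adobe Analytics rather than in code.

```python
from datetime import datetime, timedelta

# Hypothetical visit summaries keyed by visitor ID (in practice, the Analytics visitor ID).
visitors = {
    "v123": {"pages_viewed": 7, "minutes_spent": 3.5,
             "first_visit": datetime(2016, 4, 1, 9, 0),
             "return_visit": datetime(2016, 4, 1, 12, 30)},
    "v456": {"pages_viewed": 2, "minutes_spent": 6.0,
             "first_visit": datetime(2016, 3, 15, 8, 0),
             "return_visit": None},
}

def segments_for(visitor):
    """Return the behavior-based segments a visitor qualifies for."""
    qualified = []
    if visitor["pages_viewed"] > 5:
        qualified.append("viewed more than five pages")
    if visitor["minutes_spent"] > 5:
        qualified.append("spent more than five minutes")
    returned = visitor["return_visit"]
    if returned and returned - visitor["first_visit"] <= timedelta(hours=5):
        qualified.append("first-time visitor who returned within five hours")
    return qualified

for visitor_id, summary in visitors.items():
    print(visitor_id, segments_for(summary))
# v123 ['viewed more than five pages', 'first-time visitor who returned within five hours']
# v456 ['spent more than five minutes']
```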

Insight to Action for Publishers

Adobe Analytics is the award-winning analytics foundation of Adobe Marketing Cloud and the world’s largest enterprise marketing analytics solution, measuring nearly eight trillion server calls annually for over 150,000 marketers, business users, and analysts, including the world’s ten largest media companies and publishers. Forty-five percent of all server calls today come from mobile devices.

Adobe Analytics enables customers to create a holistic view of their business by turning customer interactions into actionable insights through advanced analysis capabilities and intuitive, interactive dashboards. It offers reports that users can sift, sort, and share in real time. A robust portfolio includes advanced predictive analytics and machine-learning capabilities used by leading brands, including MTV Networks, NBC Universal, Discovery Communications, Comcast, AutoTrader, and thousands more.

Here’s an example of the types of information publishers can build in Analysis Workspace. I built this one in about 5 minutes.

[Image: a sample Instant Articles report built in Analysis Workspace]

What is Instant Articles?

If you’re new to the concept, think of Instant Articles as an HTML-based format that loads content roughly 10 times faster than normal web pages, as reported in The Wall Street Journal. Instant Articles isn’t just about delivering a fast experience — it also extends the possibilities for interactive articles on mobile phones with new features like autoplay video, audio captions, and interactive maps using simple HTML5 tags. HTML5 provides an expressive format for specifying all the necessary information to lay out an Instant Article, and it allows publishers to reuse the code from their websites. This format also provides support for third-party content, including social media embeds, ads, and analytics — all of which can be embedded in Instant Articles just like they are on the web.
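As a rough sketch of what that embedding can look like in practice, the Python snippet below appends an AppMeasurement page-view call, wrapped in the iframe-in-a-figure tracker element that the Instant Articles markup supports, to an article body before it goes into the feed. The report suite ID, script URL, and page-naming convention are placeholders; follow the Adobe and Facebook documentation referenced above for the actual implementation details.

```python
# Hypothetical sketch: appending an Adobe Analytics tracker to Instant Articles markup.
# The report suite ID, script URL, and page-naming scheme below are placeholders.

TRACKER_TEMPLATE = """
<figure class="op-tracker">
  <iframe>
    <script src="https://assets.example.com/AppMeasurement.js"></script>
    <script>
      var s = s_gi("myreportsuite");          // placeholder report suite ID
      s.pageName = "instant-article:{slug}";  // reuse your site's page-naming convention
      s.t();                                  // send the page view
    </script>
  </iframe>
</figure>
"""

def add_tracker(article_body_html, slug):
    """Append the analytics embed to an article body before it goes into the Instant Articles feed."""
    return article_body_html + TRACKER_TEMPLATE.format(slug=slug)

print(add_tracker("<p>Story text goes here.</p>", slug="masters-preview-2016"))
```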

How much effort will this require?

From a practical perspective, many of you are wondering how much time you need to spend implementing measurement for Instant Articles. As Trevor Paulsen points out in his blog post, there are a few things to consider when choosing how to implement Google AMP. Since Facebook’s Instant Articles operate only within views on the Facebook app, the set of considerations is even tighter. It’s probably safe to compare your implementation of Facebook Instant Articles to tagging a redesigned section of your web site: there are some new elements to capture, and a few changes to how the code is set up, but overall, you are re-using most of what you have everywhere else. And if you’ve already begun using the iframe solution in Google AMP, it sets you up nicely for Facebook’s Instant Articles; Adobe Analytics is the only solution in the market to have a single setup for both. In short, the time requirement should be minimal, especially given the promised value of improved audience engagement.


Automation Should Inform Personalization: What We’ve Learned from Adobe Target


One area of confusion I’ve seen in personalization is around machine learning and automation. Marketers think they can plug algorithms into a personalization platform and automatically achieve strategic goals. The intent, though, is for automation to inform a personalization strategy: a matrix of manual rules and algorithms used to deliver the right content to the right person.

Combining Automation With Manual Testing
Effective ongoing personalization requires a process of testing and optimization. This process will tell you the right amount of personalization, where to use it, and whom to direct it toward. You start with a set of rules, and a solution like Adobe Target allows you to perform tests to validate those rules. The rules can be delivered manually — where you’ve set up a targeting rule — or with automated machine-learning logic that iteratively determines when to deliver a variation of content.

A test is built upon a hypothesis. You believe that, if you alter an existing experience or deliver a different content variation to an audience, you will improve click rate, conversion, or another metric that is valuable to your organization. Our automated-personalization algorithms can drive deeper audience-segment analyses and faster insights while maximizing ROI by essentially saying, “Forget about the audience. I’m going to look at every individual’s profile data and use decision-tree analysis to determine what’s most predictive for delivering this content to that customer.” This is propensity modeling: identifying the profile variables that make a person most likely to take a given action. The algorithm performs this analysis to deliver real-time targeting and personalization to each individual, and it produces segmentation as an output of the activity. It predicts who will be influenced by which variables, which can not only provide results equivalent to hundreds of tests, but also help you refine both your personalization and testing strategies moving forward.
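As a simplified illustration of the decision-tree idea (and not the actual algorithms inside Adobe Target), the sketch below fits a small tree on made-up profile data to estimate a visitor’s propensity to respond to an offer, and then prints the tree’s rules as a rough form of segmentation output. The features, labels, and library choice (scikit-learn) are assumptions for the example.

```python
# Simplified propensity sketch with a decision tree; the data and features are invented
# and this is not the actual algorithm inside Adobe Target.
from sklearn.tree import DecisionTreeClassifier, export_text

# Each row is one visitor profile: [is_mobile, visits_last_30d, purchased_before, is_weekend]
X = [
    [1, 2, 0, 1],
    [0, 8, 1, 0],
    [1, 5, 1, 1],
    [0, 1, 0, 0],
    [1, 9, 1, 1],
    [0, 3, 0, 1],
]
y = [0, 1, 1, 0, 1, 0]  # 1 = responded to the offer in the past, 0 = did not

model = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)

# Score a new visitor in real time and decide which experience to serve.
new_visitor = [[1, 6, 1, 0]]
propensity = model.predict_proba(new_visitor)[0][1]
experience = "personalized offer" if propensity >= 0.5 else "default experience"
print(f"propensity={propensity:.2f} -> serve the {experience}")

# The fitted tree doubles as human-readable segmentation output.
print(export_text(model, feature_names=["is_mobile", "visits_30d", "purchased_before", "is_weekend"]))
```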

But the algorithms are only as smart as the data that’s fed into them. The first-party data you collect online — such as geo-location, device type, time of day and week, recency/frequency, referring site, category affinity, and other behavioral attributes — is only a start. If you can bring in additional data from a customer-relationship management (CRM) platform, such as in-store and call-center interactions, and from third-party data sources, such as affiliation data and preferences from other sites, the data will be even better. The more data you bring in, the more accurate and intelligent the algorithms become.

Integrating this data can be challenging without the right solution in place. A personalization platform will have ways to incorporate this data through data-management platforms or core services that make it easy for a non-technical marketer to regularly schedule the importing of a CSV file. A best-of-breed platform will be integrated with many of these sources of data from the start.

Personalization in Practice
A major computer and technology retailer found that, with all the products and services it offers, there was a lot of bounce from people who were getting lost while navigating its website. So, the retailer created adaptive personalized navigation to make it easier for customers to return to the areas or categories with which they previously interacted. If a customer added a laptop to his cart or purchased it, the retailer would email him links for chat and recommendations for additional amenities, which resulted in reduced call-center costs and a 10-percent increase in conversion. When the customer returned to the site, the retailer would welcome him back with the image of his laptop and those same additional links.

We work with many banks that are trying to help customers self-serve with mobile apps, reducing the reliance on ATMs and branch offices. This creates greater customer loyalty and satisfaction as well as reduced overhead. One large financial institution combines its branch data with its call-center and online data each night. When you come back to its site, and if you’re an investor, it has information on your current portfolio and has already rated the next best offer for you. If you were inquiring about refinancing a mortgage, that banner pops up. If you were inquiring about a credit card, it will show you one you are eligible for rather than have you fill out a form for a card you may or may not qualify for. The institution saw a lift of several hundred percent in click-through rate with this method, compared to a traditional rules-based approach.

Marketers want to be more effective at personalization. The fact that testing is a very manual process and takes a lot of time is a barrier. What we’re learning from Adobe Target is how to apply automation — not to replace marketers, but to guide the decisions they make from the rules and tests they run and to inform the next round of testing. It’s really a give-and-take process with the potential to drive significant business results.


Why Visuals Are a Must in Social Media


In just a few years, social media has become coveted real estate for brands. It’s a crowded space where consumers spend an inordinate amount of time. Getting heard among the noise can be challenging, and brands looking to differentiate themselves not only need a compelling message, but also — and more importantly — visual content that resonates.

Employing visual social media well takes more than dropping in a random stock image; it is a deliberate, thoughtful process. Here are four tips to help you improve your social strategy with visual marketing.

1. Make it a Thoughtful Process

As stated above, choosing the perfect image is a careful, deliberate process. Explore how you want your audience to feel when looking at your imagery, and think through the factors involved in creating a visual campaign that resonates with them. Some questions to ask when collecting images:

  • Who is my target audience?
  • What is the message that I am trying to get across to them through the image?
  • How can this image help them experience this feeling?
  • What are their needs?

As a brand, if you push something out to the consumer that’s not aligned with who you claim to be, they feel it. And they’re quick to recognize the separation between what was depicted and what was delivered. Keep it thoughtful and planned.

2. Forge a Consistent Visual Identity Across Platforms

When most people think of visual social media, Pinterest and Instagram come to mind. These social platforms paved the way for an important movement toward rich social-media images, but it’s important to extend your visual campaign beyond the basics to forge a consistent identity across platforms. Twitter, Facebook, LinkedIn, and others are also embracing visuals as a leading force. Video is on the rise, and brands have started using images to create short, engaging video clips as content for their social channels. Keep your visual story consistent across platforms, and you’ll bolster your brand’s visual social identity.

3. Use Imagery — Not as a Supporting Asset but as the Message Itself

Great brands understand that the image is the message, particularly given how social platforms distribute visual content and how people consume it. Imagery is no longer a supporting asset but core messaging. Consider Apple’s visual campaign for the iPhone 6. The campaign focused on a series of images — photographs taken with an iPhone. The point of the campaign was to showcase the quality of the pictures and nothing more. In this case, a single image was stronger than the brand name or a direct callout of the phone’s new features.

4. Tie Visuals to Core Values and Mission

Influential brands use visual social media to personalize their companies. Visuals can provide a glimpse into company culture and personality, showcasing not only who you are as a brand, but also the persona of loyal fans and customers. Tying visual content to core values — and the sentiment that coincides with those values — makes it easy to tell your brand story. The key is to know your audience well and be clear about what you stand for and how you want to be perceived.

Conclusion

As the power of imagery becomes clear, smart brands are starting to shift their focus by incorporating visual social content into larger marketing strategies. Becoming more visual is not without challenges, and it’s important to recognize the value of careful planning and a deliberate process. Executing a coordinated and consistent visual strategy is possible, but it requires staying true to who you are by selecting imagery that’s tied to your core values.

Be sure to create a consistent visual identity across social platforms. Build a repository of images that adhere to a stylebook or set of guidelines that maps out how best to convey who you are and what you have to offer. Finally, avoid a disconnect by aligning visual messaging with the wants, needs, and expectations of your audience.


Personalization When Content and Data Are Lacking


There is an interesting relationship between content and personalization. As I said in a previous post, content is king, but it needs a serious sidekick to deliver maximum value. Likewise, personalization systems are incredibly content- and data-hungry, constantly devouring every article, video, offer, and touchpoint you put in their paths. It is a powerful, ongoing, and value-rich cycle.

So, now the big question is “What do you do when you are lacking the content or the data — or even both — to deliver relevance at scale? What do you do when those rich, high-value foundational pieces are missing?”

Understanding the Personalization Engine
A personalization engine is, essentially, a series of algorithms and modeling processes all tied to a targeting system. This system makes constant decisions to deliver specific content against specific criteria — in other words, visitors and the data they represent. Does this person qualify for this experience? Does he or she meet the criteria for this journey?

This decision making can be as simple or complex as you want it to be. It can be a binary decision: new visitors see this banner, while repeat visitors see something entirely different, for example. These processes work with just a few pieces of data and content and can even work with a limited audience. They are not resource heavy, because a human marketer can process the data and synthesize it fairly easily. Beyond that, chances are you will need an automated-personalization sidekick. Only a machine partner can make decisions quickly and granularly with a degree of statistical confidence — in other words, decisions you can take to the bank.

Where the Bottleneck Lies
Given how much data and content a personalization platform needs, delivering relevance can be a real bottleneck for organizations. How can a marketer tap into the power of personalization, especially when content, data, and/or traffic is lacking? And from there, at what point should a human drive the ship, and when should a machine take over?

The first thing to consider is whether your decision making is simple enough for a human marketer to tackle or so complex it needs a machine. Is the decision based on simple rules like, “always show people from San Francisco this welcome hero,” or more complex (but still manageable) rules like, “always show this hero to men from San Francisco who shop on Saturdays during the winter and who have purchased before”? Where is the line drawn when rules upon rules upon rules add so many layers that you need help with the heavy lifting? Wherever that line falls, Adobe Target can be deployed to do that vital modeling and decision making; but even then, the problem is not solved. Why? Because we are right back to the content conundrum.
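To make the simple-versus-complex distinction concrete, here is a small, hypothetical Python sketch of manual, rules-based targeting. A handful of rules like these are easy for a marketer to reason about by hand; once the conditions start stacking up across many audiences and experiences, that is where an automated decisioning engine such as Adobe Target earns its keep.

```python
# Hypothetical rules-based targeting; the visitor fields and experience names are invented.

def choose_hero(visitor):
    """Return the hero experience for a visitor using layered manual rules."""
    if visitor.get("city") != "San Francisco":
        return "default_hero"
    # Simple geographic rule.
    hero = "sf_welcome_hero"
    # A more complex (but still manageable) layered rule.
    if (visitor.get("gender") == "male"
            and visitor.get("shops_on_saturdays")
            and visitor.get("season") == "winter"
            and visitor.get("has_purchased_before")):
        hero = "sf_winter_saturday_repeat_buyer_hero"
    return hero

print(choose_hero({"city": "San Francisco"}))
# -> sf_welcome_hero
print(choose_hero({"city": "San Francisco", "gender": "male", "shops_on_saturdays": True,
                   "season": "winter", "has_purchased_before": True}))
# -> sf_winter_saturday_repeat_buyer_hero
```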

My colleague, Loni Stark, talks often about the notion of content velocity and how it is essential that you continue delivering the right images, offers, and other pieces quickly and seamlessly, while keeping pace with the mounting demands of increasingly hungry personalization platforms. It is not just a consideration; it is a real hurdle for data-driven marketers, as is the flipside of this: being smarter and more strategic with the content you have. Figuring out how to do more, more, more without more, more, more is not an easy thing to do. However, that is where data science comes in, helping you to put the right content in front of the right people, ensuring greater value for both the consumer experiencing it and for your organization. Once personalization is part of the conversation, it is possible you will not need quite as much content. Maybe now, you do not have to show 20 different scenarios or hero images, but you still need content that is going to work hard and resonate in a big way.

Where Your Organization Falls
All of that said, many organizations are not quite there yet — in a place where content velocity, machine learning, and complex personalization algorithms come into play. Many companies are still trying to create recommendation experiences, for example. It takes most businesses time to successfully clear these specific personalization hurdles.

And here is the thing: simple personalization is great, too. You do not need to boil the ocean right away. Instead, step by step, grow your optimization and personalization strategy, gaining more traction and more wins along the way. That is one of the great things about personalization: even a little moves the needle. Understand the content you can deliver — X offers in Y timeframe, for instance — and dig in to the data you have. Try delivering a different experience for tablet users versus desktop users, or ensuring that mobile visitors always get a geo-specific message. See whether the tests you are rolling out help you achieve business goals. Determine what is working and what is not.

At Adobe, we have been trying to simplify content creation for all levels of personalization processes. By leveraging the power of our Visual Experience Composer — part of Adobe Target — marketers can be more self-sufficient and even do more with less. Instead of creating more content by constantly returning to your designers or creative teams and saying, “I just need one more ad that says ‘vacation’ instead of ‘holiday’,” marketers can tackle many of these experiences themselves. It allows you to be extremely nimble, changing pieces of the consumer experience with just a few quick clicks.

And again, if that is all you are doing — swapping out ‘holiday’ for ‘vacation’ or delivering this experience for new visitors versus that experience for returning ones — that is a good first step. If, to any degree, you are using data to personalize experiences, chances are that you are going to do better than you would without delivering those varying touchpoints. Personalization is about fine-tuning experiences, but it also requires significant data and content to keep feeding the platform. Understand what you have, and what you can realistically produce, and then chart a course from there. Even if content and data are lacking, you can still deliver some degree of personalization and then refine more over time. Some is better than none, though! So, do not let a lack of content or data hold you back from dipping your toe in the personalization waters.


Data and Content: The Kimye of Digital Marketing


Kanye West and Kim Kardashian (collectively known as “Kimye”) have become one of the most scrutinized, celebrated, and vilified couples in recent memory. What is it about them that makes their relationship so captivating?

In digital marketing, there’s another viral couple evolving: data and content. (What should we call them? DaCon? ConDa?) Studying this complicated relationship provides unique insight into how data affects content and vice versa. Because, just like Kimye, the more you see of one, the more you inevitably see of the other.

The Relationship Between Data and Content

Data and content lay the foundation for delivering relevant, personalized experiences. According to International Data Corporation (IDC), 76 percent of marketers claim personalization is driving an increased need for assets. And today — with the explosion of connected devices, channels, and the Internet of Things — there’s more data than ever before. Analysts predict that 2016 will be the year of data-driven content marketing. So, what is this relationship all about?

It’s a two-way street. First, data provides context regarding the content you serve. In addition to digital-asset management and content-distribution tools, marketers need robust analytics to understand customers and serve targeted, one-to-one content. Data is the bedrock that allows marketers to measure, strategize, optimize, and ultimately, personalize.

Conversely, content drives the data that surfaces from campaigns. Content-engagement metrics enable us to gather data that can separate noise from important conversion behaviors.

We used to provide specific content for specific phases of the funnel, delivered in a linear progression. Now, we think of the customer journey as being nonlinear. Data-driven marketing allows us to move beyond a touchpoint-based approach and focus on a people-based approach, which allows us to target audiences and serve content based on so much more than a customer’s pipeline stage. To do this, content must be flexible but engaging, and most importantly, relevant to each visitor’s context.

How Data Influences Content

Content becomes a better fit when you bake in behavioral and contextual data as key ingredients, along with additional first-, second-, and third-party data. With a more sophisticated view of the audience, you can communicate with more compelling and engaging content. To do this successfully, feed content-engagement behaviors into analytics to help target audiences and, ultimately, shape future content. It’s important to build the “voice of the customer” into this, gathering data and feedback from call centers and surveys to round out your approach to delivering optimal experiences.

A key piece of being able to deliver personalized experiences is being able to recognize that person — especially when they are reaching your brand from different devices. Recently, Adobe announced the Adobe Marketing Cloud Device Co-op, a network that will enable the world’s biggest brands to work together to better identify consumers as they move from one digital device to another — all while adhering to the highest standards of privacy and transparency. This co-op will empower participating brands to deliver more personalized consumer experiences across devices and applications at massive scale. Ensuring that the content responds and adapts to the screen, as well as the channel from which it’s being accessed, is critical to providing a good user experience.

How Data Guides Personalized Content

The Kimye-style relationship between content and data becomes even more interesting when you talk about automated personalization. It combines the creative aspect of human intelligence with the analytical power of data science to simplify the process of extracting insights from billions of data points. Marketers can use these insights to make better business decisions, focus on creating standout content, and benefit from recommendations and predictions they didn’t even know existed. For example, the new “lifetime value decision” feature in Adobe Target helps marketers predict a path of conversions that leads to the highest profit from a customer over time. It analyzes past customers’ interactions, as well as their in-session behaviors, and recommends new product, content, or service offers in real time.

Reaching Content Velocity (Be Everywhere. All the Time.)

According to IDC, 85 percent of marketers say that they’re under pressure to create assets and deliver campaigns more quickly. And 71 percent of marketing professionals say they need to create 10 times as many assets to support customer-facing channels. As you probably know, consumers’ expectations for relevant content have risen dramatically — and the time they have to engage is shrinking. Let’s face it, we all demand richer, more personalized experiences across all of our devices and at every touchpoint. This means every brand is under pressure to increase the quality and quantity of content used to deliver customer experiences. And they need the tools to support this demand by getting more content to market — and faster.

Adobe Experience Manager (AEM) enables businesses to manage large volumes of creative and marketing assets that empower their teams to deliver amazing customer experiences across any channel. AEM’s new feature, Content Fragments, enables authors to create content centrally and tweak it for different screens and channels, including social, in a fluid and automated flow.

True North

Now we understand the codependent relationship between data and content. In what direction can you go with data-driven content marketing? Whether you’re analyzing data types using solid extraction, organization, and analysis techniques, or emphasizing the delivery of targeted, relevant content within a user’s context and in real time, data drives content, which drives more data, and on and on it goes. No matter what, Adobe has the solutions to guide your path.


Website Conversions and Conversion-Rate Optimization: Every Brand’s Secret for Success


If you don’t operate a traditional B2C business, it may seem as though you don’t need to worry about website conversions or conversion-rate optimization.

However, conversions are important for every business — no matter what your business model, service, or product is.

Every action a user takes on your site has an impact on macro and micro conversions — whether downloading an app, liking a Facebook page, or signing up for a newsletter. These conversions connect prospects and customers more closely to your brand and move them through the sales funnel. Typically, when people talk about conversions, they are referring to sales. However, brands should think of conversions as any time a user is successfully convinced to take action with your brand. And improving and optimizing those actions can provide an incremental and cumulative impact on revenue.

Optimization Through Testing

Conversion-rate optimization should be an ongoing, iterative process; both content and customer behavior/preferences are constantly changing due to new initiatives, campaigns, and trends. Testing allows you to refine your messaging, imagery, design, and audience for each campaign. Testing is something that should be accessible to every team in your organization, as they are all experts and can make educated contributions in their particular fields. Testing should be a democratized process, not something that only a select few in the organization manage.

To properly conduct a conversion-rate optimization test, you do not have to be an IT (information technology) professional or a statistician, but you do need to have a strategy. It’s best to use both quantitative assessments (analytics reports, anomaly detection that highlights a bounce in a particular location) and qualitative analyses (customer focus groups and suggestions from product or channel specialists). Both boil down to determining a list of problems you would like to resolve.

Running tests certainly takes time and resources, which means a cost-benefit analysis is valuable for prioritizing testing activities and for constantly re-evaluating that prioritization based on results. The higher the velocity and quality of conversion-rate optimization you sustain, the greater the returns in terms of revenue and data-driven insights.

What is really amazing is that, throughout the testing process, you receive real, hard data from reporting. You can also obtain additional drill-down and context provided by integrated analytics tools, enabling you to improve your conversion rates by providing actionable, real-time information. This is the motivation behind optimization solutions that provide integration with an analytics tool: broader contextual qualification of what you are seeing in the results and assurance that no audience is overlooked within a broader testing population.

When coupled with analytics, testing provides your brand with myriad ways to increase the number of positive customer experiences — such as providing a more efficient path-to-purchase or easier access to answers for frequently asked questions — which results in increased conversions and greater customer lifetime value. Machine-learning algorithms and automated personalization can also be very powerful in real-time performance evaluation, while maximizing relevancy and propensity for single or multiple conversions.
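On the quantitative side, a test read-out ultimately comes down to comparing conversion rates with enough statistical confidence to act on them. Your testing tool reports this for you, but as a back-of-the-envelope illustration, here is a simple Python sketch of a two-proportion z-test on made-up results; the traffic and conversion numbers are invented.

```python
# Back-of-the-envelope significance check for an A/B test; the traffic and
# conversion numbers are invented for illustration.
from math import sqrt, erf

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (relative lift, z statistic, two-sided p-value) for variant B vs. control A."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided, via the normal CDF
    return (p_b - p_a) / p_a, z, p_value

lift, z, p = two_proportion_z(conv_a=480, n_a=12000, conv_b=560, n_b=12000)
print(f"lift={lift:.1%}  z={z:.2f}  p={p:.3f}")
# A small p-value (commonly below 0.05) suggests the variant's lift is unlikely to be noise.
```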

Optimization Through Personalization

Once you have a better understanding of what works for each audience type, you can begin optimizing conversions through personalization. Personalization allows you to present each audience type with the product or offer that they are most likely to act upon. Using data to understand your audience segments’ preferences — however broad or niche they may be — will help you to improve conversions overall.

For instance, if a father of a newborn baby hits your landing page where you are promoting your highest-selling item — a breast pump, let’s say — this may not be the most relevant offer for this audience segment. Instead, if you personalize your offer based on the most predictive audience that this user falls into, you will have a higher likelihood for conversion. In this particular example, your testing may have revealed that fathers of newborns, clicking from a search for “things that fathers should know about newborns,” are most likely to download an app that helps them remember when they last fed, changed, or washed their babies. Therefore, you could personalize your landing page to prompt them to download your app instead of attempting to sell them something they are unlikely to buy.

CRO: Every Brand’s Secret for Success

Ultimately, no website is above conversion-rate optimization. Even power users in this space realize that it is a constantly changing landscape. Apply a critical eye to your own website; look at each element of an important page and ask questions related to strategic-optimization methodology: Does content belong here? What type of content? What is the most impactful design, look, and feel? Should there be messaging, and should it be personalized? There is always room for improvement, just as there is always room to refine your message and call to action. Conversion-rate optimization can drive deeper customer engagement; improve customer consumption, loyalty, and brand advocacy; and increase overall revenue over time. These are things all brands should be striving to improve upon — and now, through testing and personalization, you can.


And The Winner of The Masters is… Apple TV and Roku



Augusta National, home to The Masters, is one of the most iconic and hallowed sports grounds in the world. With no Tiger Woods in the tournament this year, will it be Jordan Spieth, Rickie Fowler, or Phil Mickelson who steals the show and the hearts of millions across the world? I’m not sure which player will win, but in a brave new world of live streaming, look no further than Apple TV and Roku to take over The Masters this year.

In 2015, I showed that the iPhone and iPad would be my devices of choice for watching The Masters. This year, however, I, and many others, will be looking to Apple TV and Roku to lead the charge. According to a report released by Adobe Digital Index, TV-connected device viewing of TV Everywhere content, including The Masters, is up 31% year-over-year.


For years, we as consumers have been enamored with mobile devices for video viewing, but the recent growth in connected and smart TVs is leading a shift back to TVs as our device of choice for sports. With a connected TV, you can view the four-day Masters event from start to finish without interruption, or you can view only what the network wants to show you. It provides the option to follow certain players, watch certain holes, or just follow the normal broadcast. You can also use AirPlay to send the freely available content from your phone to your television.

Sports has long been the last egg to drop in a world demanding à la carte programming, but the Super Bowl, March Madness, and other sporting events are now live streaming some of their product for free without authentication. This is great for cord-cutting consumers who hunger for access to major sporting events, and also for content creators who can tap into the growing availability of digital advertising. For advertisers, a digital world provides an opportunity to target and create calls to action for consumers who have long been fast-forwarding through their linear ads anyway.

I’ll be watching and interacting with The Masters app on most of my devices this week, but I’ll be hoping to see Jordan Spieth, Rickie Fowler, or Phil Mickelson put on the green jacket, streamed through my Apple TV onto my big screen at home.
