So it’s that time of year again when commercialism runs rampant, people spend with reckless abandon, and at any moment there could be fisticuffs at your local Wal-Mart. But alas, this is Holiday Season in America, so be joyous about it!
I’ve been watching online spending trends for the past decade, and most recently trying to discern what impact mobile and social media have on all that glitters online. All signs indicate that 2013 is busting records with all-time highs for online sales, yet depending on which data you believe, there are different stories to be told.
Two analytics leaders, IBM and Adobe, routinely benchmark holiday shopping. And while their methodologies differ, so too does their data. Here’s a snapshot of some of their published findings thus far:
Show me the Money
IBM’s Digital Analytics Benchmark reports an 18.9% increase over 2012 in Black Friday online sales. Average Order Value (AOV) was $135, with an average of 3.8 items per order.
Adobe’s Digital Index reported even stronger growth in sales, with a 39% increase over 2012 for a whopping $1.93 billion in online sales. Adobe reported a similar AOV at $139 and also revealed that the peak shopping time on Black Friday was between 11 AM and noon ET, when retailers accrued $150 million during this single profitable hour.
While both companies reported lift in 2013 online sales during these two days of shopping, both also indicate substantial lift in Thanksgiving Day sales, which may have cannibalized some of Friday’s profits. And while Cyber Monday numbers are still being tallied, all signs point to the biggest online shopping day yet, which likely has retailers grinning from ear to ear early in this short 2013 holiday shopping season.
Both indices show mobile as a significant driver of online sales. Adobe reported that on Thanksgiving and Black Friday, nearly one out of every four sales was made via a mobile device. iOS devices, and in particular iPads, were the devices of choice in both companies’ findings. Adobe reported that a total of $417 million in iPad sales was recognized in just two days (Thanksgiving and Black Friday) by businesses within its index.
This should come as no surprise to those of us following the data, but mobile now represents nearly 40% of all Black Friday traffic. That’s a trend that retailers just cannot ignore. And as a consumer, you probably can’t ignore it either. Tactics reported by IBM indicate that retailers sent 37% more push notifications via alerts and popup messages on installed apps during these two heavy online shopping days.
Where in the World?
The biggest discrepancy between the two online shopping benchmarks comes from the geographic perspective. Keep in mind that IBM’s Digital Analytics Benchmark comprises data from 800 US retail websites, while the Adobe Digital Index represents a wholly different set of US retailers that accrued 3 billion online visits during the Thanksgiving-to-Cyber Monday shopping spree. (Note that exactly comparable data isn’t provided in publicly available information.)
Yet Adobe’s data ranks the top states for online shopping on Black Friday as 1) Vermont, 2) Wyoming, 3) South Dakota, 4) North Dakota, and 5) Alaska, citing weather and rural locations as the rationale for these states topping the list. IBM, on the other hand, indicates that on Black Friday 2013 the highest-spending states in its benchmark included New York, California, Texas, Florida, and Georgia. It’s not atypical to see variances between data sets, but keep in mind when interpreting results for yourself that it all comes down to the data collection method: results will vary based on who is in your benchmark and how you’re slicing the data.
While IBM’s early data, cited in an article by All Things Digital, made the outlook for social appear dreary, Adobe weighed in with a contradictory and uplifting perspective on social. IBM did not report on social sales for Black Friday 2013, apparently because the findings weren’t “interesting”, but its 2012 report showed that directly attributable revenue from social media (last click) was a dismal 0.34% of Black Friday sales. By my math that equates to a paltry $3.5 million in total online dollars via social media sales for Black Friday. The AllThingsD reporter managed to eke out of Jay Henderson, IBM’s Strategy Director, that social sales were flat again this year. Moreover, the article quotes Henderson as saying “I don’t think the implication is that social isn’t important, but so far it hasn’t proven effective to driving traffic to the site or directly causing people to convert.” Hmm…
However, this year Adobe is telling a slightly different story. According to its Cyber Monday blog post, social media referred a whopping $150 million in sales in just five days, from Thanksgiving to Cyber Monday. While it’s not clear whether they’re tracking from a last- or first-click perspective, this data indicates that social is pulling its share of the holiday sled this 2013 season. Well, at least social is pulling about 2% of the sled, based on $7.4 billion in total online sales from Thanksgiving through Cyber Monday.
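For readers who want to follow the math, here is a quick back-of-the-envelope check on the figures above. All numbers come from the IBM and Adobe reports as quoted in this post; nothing here is new data.

```python
# IBM (2012): last-click social revenue was 0.34% of Black Friday online sales.
# Working backward from the ~$3.5M estimate gives the implied day's total.
implied_black_friday_total = 3.5e6 / 0.0034
print(f"Implied 2012 Black Friday online total: ${implied_black_friday_total / 1e9:.2f}B")  # → $1.03B

# Adobe (2013): $150M social-referred out of $7.4B from Thanksgiving to Cyber Monday.
social_share = 150e6 / 7.4e9
print(f"Social share of five-day online sales: {social_share:.1%}")  # → 2.0%
```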
Whichever metrics you choose to believe, counting dollars in social media ROI is never an easy task and it usually doesn’t lead to riches. I’m about to publish a white paper on this very topic, so if you’d like to learn more about quantifying the impact of social, email me for more info.
The Bottom Line
This holiday season is shaping up to be the biggest yet for retailers of all sizes. Remember when just a few years ago people were afraid to buy ***anything*** online? Well, it certainly appears that those days are gone. So, as the days before Christmas (or whichever holiday you celebrate) wind down, and the free shipping deals get sweeter, and the door-busters swing closed until next year, take a close look at your data to see what the digital data trends leave for you.
Google recently launched a new television commercial that advertised their Chrome browser as a solution for your computer, tablet, and mobile device. For marketers and digital analytics pros of all types, this solution has real potential. Not because of the convenience of the solution, but because it potentially solves our problem of identifying visitors to our websites and mobile apps as they traverse from work computer, to mobile to tablet…throughout the day.
First check out the video:
Here’s why this solution has potential for consumers…
Errr…what’s my password again? In an increasingly password-protected web, users will find this unified browsing service valuable. How many times have you scratched your head and asked yourself…”What’s my password?” This unified browser resolves that issue with Chrome’s saved password feature. For those of you not using OS keychains or another solution for recalling your passwords, this is a sure-fire way to minimize the dreaded password reset.
Faster than a speeding search engine. Google’s search (while Bing is giving it a good run) is getting smarter. The Chrome “Omnibox” (you know it…it’s the address bar) will automatically predict what you’re typing (if you let it), which virtually tells you that Google is smarter than you are. Not only does this help you get to the right stuff more quickly, it also recalls where you’ve been previously. But if you’re not into that sort of thing: “Google only records a random two percent of this information received from all users and the information is anonymized within 24 hours. However, if you use Chrome Instant, your data can be kept up to two weeks before it’s deleted.”
Remember my Tabs? No, I’m not talking about the “Totally Artificial Beverage” soft drink (for those of us old enough to remember Tab cola), which was the predecessor to today’s ubiquitous Diet Coke. I’m talking about the tabbed browsing experience. Since most of us bounce between devices as a matter of habit, the ability to bookmark a tab on one device and pick up another to find the same page is becoming increasingly valuable. No more searching for that web page you found right before your boss walked into your cubicle. Simply tap the bookmark star and you’ve got it remembered on all of your Chrome-synched devices.
Here’s why this is a web analysts’ dream…
For us web analytics wonks, having Google Chrome Now Everywhere could help us solve the problem of identifying visitors across devices and sessions when they don’t log in. I cannot count the number of conferences, expert panels, and lobby bar conversations where I’ve heard the question asked: “How can we identify anonymous users across devices?” Well, Google could now potentially solve this problem for a subset of devoted Chrome users…if they choose to make this data available. That’s a big if…
Even though Google also announced Universal Analytics today, it would still have to make this cross-device data available to us #measure folks. Wouldn’t that be AWESOME? But who knows if they’ll open up this really valuable data? Perhaps Google is holistically trying to help marketers by someday tying products like Chrome and Google Analytics into a common perspective… But perhaps that’s just too progressive for the privacy pundits. I don’t know.
No digital analytics solution is 100% accurate in its ability to understand user behaviors, due to cookie deletion rates, missing data, and anonymous browsing. But Chrome’s omni-device presence would certainly help identify with precision those users who opt in to this solution because of the benefits it offers. I’ve been saying this for years: it’s all about the value exchange. And the value derived from having Chrome remember all of your passwords, favorite pages, and preferences is well worth it for many. Don’t be surprised if Safari, Firefox and others start riding GOOG’s coattails on this one…
What about you? Do you think this will change #measure?
Attributing credit across a multitude of marketing efforts is one of those sticky problems in digital analytics that seems to generate a whole lot of controversy. This topic comes up with nearly all of my clients and is one that both Eric T. Peterson and I have been researching and writing about for some time now. My latest findings on attribution will be published in a whitepaper sponsored by Teradata Aster, titled Attribution Methods and Models: A Marketer’s Framework, but you can tune in to our webcast on January 16th to get the high notes.
While some pundits argue that attribution is not worth the trouble and that all attribution models are flawed, others contend that attribution simply requires a healthy dose of marketing science, which will enable marketers to reap benefits tenfold. At the risk of opening up a whole can of Marketing Attribution worms, I’ll offer my Marketer’s Framework for Attribution, a pragmatic approach to organizing, analyzing, and optimizing your marketing mix using data. But first, let’s define marketing attribution:
Web Analytics Demystified defines Marketing Attribution as:
The process of quantifying the impact of multiple marketing exposures and touchpoints preceding a desired outcome.
The first question to ask yourself is whether you really need attribution in your analytical mix of tools, tricks, and technologies. I offer this as a starting point because attribution isn’t easy, and if you don’t really need it, you can save yourself a whole lot of headaches by short-cutting the process and offering a data-informed explanation of why you don’t want to mess with attribution.
The approach I offer is shamelessly ripped off from Derek Tangren of Adobe, who blogged: “Do we really need an advanced attribution marketing model?” Derek encourages his readers to answer this question by looking at their existing data to determine what percentage of orders occur on a user’s first visit to the website versus those that occur across multiple visits. I bastardized Derek’s idea and applied it to help marketers understand how many visits typically precede a conversion event. While Derek offers a way to do this using Adobe Omniture, I’ve created a custom report within Google Analytics that does virtually the same thing. I call it the Attribution Litmus Test.
My version is a quick sanity check for those of you running Google Analytics to determine the number of conversions that occur on the first visit versus those that occur on subsequent visits. To use it, you must have your conversion events tagged as Goals within Google Analytics (which you should be doing anyway!). If you’d like to run the Attribution Litmus Test on your own data, you can add the Custom Report to your GA account by following this link: http://bit.ly/Attribution_litmus_test.
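The logic behind the litmus test can be sketched in a few lines. The tally below is invented for illustration: it maps the visit number on which a conversion occurred (Google Analytics exposes this as “Count of Sessions”) to a count of conversions.

```python
# Hypothetical export: visit number on which each conversion occurred -> conversions.
conversions_by_visit = {1: 420, 2: 180, 3: 95, 4: 55, 5: 50}

total = sum(conversions_by_visit.values())
first_visit_share = conversions_by_visit.get(1, 0) / total

print(f"Conversions on the first visit: {first_visit_share:.1%}")   # → 52.5%
print(f"Conversions on later visits:    {1 - first_visit_share:.1%}")  # → 47.5%
```

If the first-visit share dominates, a simple last-click view may be good enough; a heavy tail of multi-visit conversions is the signal that a fuller attribution effort is worth pursuing.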
So now that you’ve determined that Attribution is a worthwhile endeavor to pursue for your organization, let’s dive into the Framework. According to a study conducted by eConsultancy, only 19% of Marketers have a framework for analyzing the customer journey across online and offline touch points. Yet, the reality of consumer behavior today illustrates that multi-channel marketing exposures and multiple digital touch points are commonplace. As such, Marketers need a method for understanding their cross-channel customers in a systematic and reproducible way.
Step 1: Identify Your Data Sources
The first step in using an Attribution Framework is to identify and ingest your data sources. Because advanced attribution requires understanding marketing effectiveness across all channels, you must acquire data from each channel that potentially impacts the customer path to purchase. Typical digital channels include display advertising, search, email, affiliates, social media, and website activity.
Step 2: Sequence Your Time Frame
All attribution models must consider time to understand which marketing exposures occurred first, and also to discern the latent impact of exposure across channels. This requires that organizations sequence their data. While numerous data formats will likely go into the model, we’ve seen the greatest success when attribution data is stored and aggregated within a relational database.
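As a concrete illustration of the sequencing step, here is a minimal sketch (in Python rather than SQL, purely for brevity) that orders hypothetical touchpoint records per user by timestamp; in practice this aggregation would live in the relational database mentioned above. All field names and records are invented.

```python
from datetime import datetime

# Hypothetical raw touchpoint records from multiple channel sources.
touchpoints = [
    {"user": "u1", "channel": "display", "ts": "2013-11-20T09:15"},
    {"user": "u1", "channel": "email",   "ts": "2013-11-22T18:02"},
    {"user": "u1", "channel": "search",  "ts": "2013-11-29T07:40"},
    {"user": "u2", "channel": "social",  "ts": "2013-11-28T12:00"},
]

# Order all touches by time, then group them into an ordered path per user.
paths = {}
for tp in sorted(touchpoints, key=lambda t: datetime.fromisoformat(t["ts"])):
    paths.setdefault(tp["user"], []).append(tp["channel"])

print(paths)  # → {'u1': ['display', 'email', 'search'], 'u2': ['social']}
```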
Step 3: Apply Attribution Models
The actual attribution models determine how you look at your data and make determinations about which marketing channels, campaigns, and touch points are effective in the context of your entire marketing mix. There are five models commonly used in the attribution world: First Click, Last Click, Uniform, Weighted, and Exponential. To learn more about these models, tune in to the webcast, where I explain each in more detail.
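To make the five models concrete, here is a hedged sketch of how each might allocate conversion credit across an ordered path of unique channel touches. The specific weights, such as the 40/20/40 split for the weighted model and the decay factor for the exponential model, are my illustrative assumptions, not prescriptions from the whitepaper.

```python
def attribute(path, model, decay=0.5):
    """Return {channel: credit} for an ordered path of unique channel touches."""
    n = len(path)
    if model == "first_click":
        weights = [1.0] + [0.0] * (n - 1)
    elif model == "last_click":
        weights = [0.0] * (n - 1) + [1.0]
    elif model == "uniform":
        weights = [1.0 / n] * n
    elif model == "weighted":
        # Illustrative split: 40% first touch, 40% last, 20% spread over the middle.
        if n == 1:
            weights = [1.0]
        elif n == 2:
            weights = [0.5, 0.5]
        else:
            weights = [0.4] + [0.2 / (n - 2)] * (n - 2) + [0.4]
    elif model == "exponential":
        # Later touches count more; normalize so credit sums to 1.
        raw = [decay ** (n - 1 - i) for i in range(n)]
        weights = [r / sum(raw) for r in raw]
    return dict(zip(path, weights))

path = ["display", "email", "search"]
print(attribute(path, "last_click"))  # → {'display': 0.0, 'email': 0.0, 'search': 1.0}
```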
Step 4: Conduct Statistical Analysis
After the data has been prepped, sequenced, and cleansed, this is typically where data scientists conduct general queries, apply business logic, and run what-if analyses against the model. Agencies that specialize in attribution modeling, like Razorfish, have advanced analytics teams of data scientists who attack the data. They’re looking for correlations to answer questions like: if users are exposed to marketing assets A > B > C, are they likely to take action D?
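As a toy illustration of that question, the sketch below compares conversion rates for users whose ordered path contains the exposure sequence A, B, C against everyone else. The paths, conversion flags, and subsequence test are all invented for illustration.

```python
def contains_sequence(path, seq):
    """True if seq appears in path in order (not necessarily adjacent)."""
    it = iter(path)
    return all(step in it for step in seq)

# Hypothetical (path, converted?) records.
users = [
    (["A", "B", "C"],      True),
    (["A", "C"],           False),
    (["B", "A", "C"],      False),
    (["A", "B", "X", "C"], True),
]

exposed = [conv for path, conv in users if contains_sequence(path, ["A", "B", "C"])]
others  = [conv for path, conv in users if not contains_sequence(path, ["A", "B", "C"])]

print(f"Converted after A>B>C: {sum(exposed)}/{len(exposed)}")  # → 2/2
print(f"Converted otherwise:   {sum(others)}/{len(others)}")    # → 0/2
```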
Step 5: Optimize Marketing Mix
Of course, the ultimate goal in utilizing an attribution framework is to make decisions that impact your marketing efforts. These decisions can be strategic such as: deciding to invest in a new social media channel; discontinuing use of a non-performing affiliate partner; or reallocating budget to highly successful channels. But an attribution model can also play a major role in making daily life marketing decisions such as: which keywords to bid on during a specific campaign; who should receive an email promotion; or where to place that out of home billboard to attract the most attention.
In conclusion, Marketing Attribution continues to be an Achilles’ heel to many marketers. But, the good news is that approaching attribution with the right toolset and a framework for solving the attribution riddle is definitely the way to go. Throughout my latest research, I talked with companies like Barnes & Noble, LinkedIn, and the Gilt Groupe to learn how they’re using and applying Marketing Attribution models. I’ve also had the good fortune to demo some of the latest attribution tools from industry leading vendors like Teradata Aster and Visual IQ. Through this research, I learned that there is some truly innovative work going on with regard to attribution, but there is no single best way to do it. I’d love to hear how you’re solving for attribution. Please shoot me a note, tune into our webcast, or comment on how you’re re-examining attribution.
I’ve been spending a lot of time recently working with data. For some clients I’m helping to assemble data from multiple sources across their enterprise to answer business questions like how does clickstream behavior impact revenue. For other clients, I’m strategizing about using aggregate data to create new opportunities that provide added insights and actionable steps toward increasing profitability. And for fun, I’m slicing through data to gain greater understanding of events I’ve missed or simply things that I’m curious about.
This last effort is what got me typing today. As I sorted through Tweets and scoured the web for information about the recent DAA Symposium in San Francisco, I was heads down looking at data. I wanted to accomplish two very specific objectives: 1) to validate a new calculated metric that I’m working on, and 2) to simply find out how the event was and what type of knowledge was being shared.
So I turned to five different tools to try to find the answers that would satisfy my curiosity.
My research quickly yielded data that showed how many Tweets with @DAAorg and #SanFranDAA were flying; who the top contributors were; and in some cases how many impressions were created by these messages across the Web. As I researched more, I became increasingly focused on the numbers and sought to find the story within the data. As I dug deeper, my tracking spreadsheet started to grow, and I began to see that each of the five tools had significant gaps in the data it provided. While most were able to reveal the total volume of mentions for my specific keywords, there was a great deal of variation in what they found. Further, the data produced by these tools often lacked the metrics that I needed to perform my calculations.

But what really struck me was that amid all this data, very few of these tools told me anything about the content of what was being said. Sure, I could scroll through the individual Tweets and see the content; there were also lists of top keywords showing me what was mentioned most, and in a few cases even word clouds that highlighted commonly mentioned terms and their relationship to my search query. But through all of this data I still didn’t know what really happened at the DAA Symposium in San Francisco. I needed someone who was there to fill in this essential piece of information.
But I was still determined to produce something from my exercise in curiosity, so I sent out a Tweet with a quantitative perspective on what I had discovered. Almost immediately, I received a response that asked… “@johnlovett @DAAorg so what’s the qualitative story?” I had this same question in my mind, and with the help of this one innocuous Tweet, I realized that every data exercise can benefit from both the quantitative and qualitative sides of the story. Either one alone is woefully insufficient. By digging into the data, I could see things that helped me understand what happened at the event, and I was even able to gain a better understanding of the awareness created by the event using my calculated metric. However, what I failed to capture by looking at the data alone was the qualitative message. Through all the Tweets and data I analyzed, I learned some very interesting things, but the results of my analysis were hollow without a firsthand narrative to accompany them.
While this may be painfully obvious to many, all too often I see organizations lose sight of this fact. They expect digital analysts to amass data and crunch numbers to uncover revelations about the business. But in many cases, these analysts don’t have the benefit of understanding the strategy behind the numbers or the context of a story that the data can support. This makes their jobs considerably more difficult, and ultimately it leaves their analysis with a void that is begging for a narrative. In my experience, this narrative comes from collaboration between analysts and business stakeholders who take both sides (the quantitative and the qualitative) to showcase results in a manner that is not only meaningful, but also leaves a lasting impression.
So the next time you’re itching to deliver that beautiful analysis you just created…or if you’re listening to an eager analyst share new data…ask yourself if the perspective you’re hearing considers both the quantitative and qualitative sides of the story. If not, ask for more.
A new report debuted last week on CEO.com from the creators of DOMO, which cited findings about the social participation of Fortune 500 CEOs. The report showcased the fact that only 7.6% of big-company CEOs are on Facebook; only 1.8% actually use Twitter; and 70% of global CEOs have no social media presence at all. To these numbers, I say…FANTASTIC!
Now, don’t get me wrong…I’m a huge proponent of social media and of measuring it methodically…I even wrote a book on the topic. Further, I agree that social media has become a transformative force that’s changed the way individuals and businesses communicate. Of course, without a doubt! Yet when CEOs are called to task for not individually participating in social channels…well, I for one think that they should be spending their time focusing on fiscal responsibility, shareholder value, and customer satisfaction with their products and services. These CEOs should be lauded for focusing on what matters and for delegating a social presence to others within their organizations who are hired to interact with consumers and to keep a finger on the pulse of their marketplace.
While this report certainly doesn’t shed light on what CEOs actually do spend their days doing, it does show that they aren’t looking to social media as an output channel. And thank goodness for that. While social media is undeniably valuable for communicating with consumers, marketing to them, and interacting in meaningful ways…last time I checked, that’s not the job of a chief executive. Do they need to be aware of it? ABSOLUTELY! Do they need to be open to consumer and employee interactions? Why, yes! But do they need to be a first-line responder? I think not. There are lots of ways for executives to stay informed and to communicate. Yet bolstering a social media presence only to abandon it shortly thereafter, or to let it die a slow, unused death, doesn’t help anyone’s credibility.
Maybe I’m alone, but in my opinion the underlying premise of this research missed the mark by a long shot. Fortune 500 CEOs shouldn’t be pandering to consumers on social media. Let’s allow the executives in chief the opportunity to focus on business and save the Twittering and Facebooking for the marketers.
Before too much time passes during these dog days of summer, I thought I’d offer a recap of the eMetrics Marketing Optimization Summit that took place in Chicago recently. First of all, Chicago really digs analytics. Despite a smallish eMetrics crowd of around 100 people, there was lots of energy, young talent, and academic interest.
I had the privilege of sharing a few minutes of the opening keynote with Jim Sterne, where I made a few announcements about the newly rebranded DAA (Digital Analytics Association). I proudly announced that we transitioned 25% of our Board of Directors by adding new members Eric Feinberg, Peter Fader, and Terry Cohen to our diverse assembly of directors. I also took the stage in my new role as President of the DAA and shared my thoughts about the epic journey we’ve collectively embarked on in this industry that we call digital analytics. This is a theme I reiterated during my closing presentation on The Evolution of Analytics, in which I concluded that the future state of our evolution is up to each of us to determine.
But speaking of future success, I commend the local DAA Chicago Chapter for the great strides they’ve made, not only in organizing our open industry meeting, but also in championing the cause for digital analytics in the Windy City. The DAA has much better brand recognition and awareness in Chicago than I thought. But I suppose I shouldn’t be too surprised, because according to the DAA Compensation scan, Chicago is the second-best place to live if you’re seeking a job in analytics.
Moving on to more details about the conference: Jim Sterne always encourages attendees to measure the value of eMetrics not just in the content, but also in the hallway conversations and the key tidbits that you take back to your desk when all the sessions and lobby-bar fun is over. In Chicago, for me the hallway conversations focused on several hot topics in analytics, including tag management, privacy, and of course the perennial analytics issues of people, process, and technology.
I also learned (privately) that Amazon is doing some crazy brilliant stuff, but it’s so good that they can’t even talk about it. The senior brass at the really good companies are very protective, but web analysts can still be plied (at least a little) with alcohol at a Web Analytics Wednesday.
And finally, people who do know what we do are struggling to pull together the pieces for making an analytics program work…finding staff, selecting tools, building process. These are perennial issues in digital analytics and why we’ve built our consulting practice here at Web Analytics Demystified to help solve these problems.
But as always at eMetrics, I was invigorated to speak with new entrants to digital analytics as well as the usual suspects. I’ll be taking something back to my desk and to my clients from this eMetrics…and that is a fresh perspective.
Anyone who has been in this game for any length of time should recognize that it’s easy to become steeped in your own myopic view of digital analytics and to rehash the same perennial issues with the same examples over and over again. Yet any good analysis – or method of teaching – needs to evolve to remain relevant. And thus, this eMetrics taught me that experience needs to be tempered with the fresh eyes of unbridled passion and enthusiasm. While we may hold the frameworks and fundamentals, it is they who hold the spark. I for one appreciate what the next generation of digital analysts is bringing to this industry and hope to learn as much from them as I can offer.
I’m on the plane returning home from the second ever Web Analytics Demystified ACCELERATE Conference and I can’t help but smile as I think about what an incredible event this was. For starters, demand for this event maxed out the ~200-person capacity of our Chicago venue at the Gleacher Center, but we managed to comfortably squeeze in all of our registered guests, as well as everyone who showed up on the waiting list. Of course, Chicago was well represented, but there was also a preponderance of Ohio analysts in the house. The OHiO solidarity was reiterated with incessant demands for a Columbus ACCELERATE sometime in the not-too-distant future…to which we say: anything’s possible.
Once we kicked off, the room was electrified by Eric Peterson’s inspiring opening comments and you could definitely feel the energy in the air. We promised our attendees a fire hose of content and delivered by honing our “10 Tips in 20 Minutes” format to keep things going at a frenetic but well managed pace. Based on comments and feedback we received, I think it’s safe to say that anyone who was there will tell you that we over-delivered. You can check out the recent Tweets on #ACCELERATE yourself, but I’ll offer up a few notable comments:
medmonds: Very impressed with the #ACCELERATE conference – insightful tips & strategies for optimizing digital channels from industry leaders #MEASURE
Jonghee: Completely satisfied with #ACCELERATE. It’s quality is better than some of the expensive ones. Great job @erictpeterson and the team!
Ableds2: Few industries/professions strive for excellence like this group. I am honored to be surrounded by amazing people #ACCELERATE #measure
#ACCELERATE by the Numbers (April 4, 2012)
One of my responsibilities during ACCELERATE, beyond delivering my 10 Tips on Using a Social Media Measurement Framework was to track the Twitter stream to see what was coming in throughout the day of the conference and who the BIG Tweeters were. I thank TweetReach for providing access to their monitoring tool, which allowed me to conduct my analysis in near-real time as Tweets tagged with #ACCELERATE were flying across the Interwebs.
***Note: My TweetReach Tracker is set up for East Coast time, so this reflects a -1hr Time Zone delay.***
Exposure: (measured in Top Contributors by impressions) We did a pretty good job overall of sharing the love emanating from ACCELERATE on Twitter with 3.23 million impressions reaching an estimated 240k people on April 4, 2012. The 6 top contributors delivered 69% of the total impressions and they included: @EricTPeterson, @EndressAnalytic, @johnlovett, @jennyweigle, @monishd, and @MicheleJKiss (who wasn’t even there!). If you’re looking for folks to get the word out on Twitter, consider this your shortlist.
Velocity: (measured in ReTweets and total impressions) Overall, the most retweeted Tweet for the 24-hour period was by Erica Chain, who garnered 10 RTs on her 140-character missive about Joan King’s Crate & Barrel presentation. Note to the velocity Tweeters: pictures get more RTs! I had a chance to talk with Erica and learned her amazing story, which was an added bonus. But Monish Datta won our cash prize for the most retweeted Tweet as of 3 PM, attaining 7 RTs and over 16k impressions. Monish and the team from Victoria’s Secret were well represented at ACCELERATE, and they all added great value and velocity to the Tweet stream.
Penetration: (measured as the percentage of #Measure Tweets containing the #ACCELERATE hashtag) Over the course of the day, #ACCELERATE occupied 71.2% of all Tweets on the #Measure hashtag. Since we were delivering a fire hose of information during ACCELERATE, we encouraged attendees to Tweet over our hashtag as well as the #Measure hashtag throughout the day. Apparently they listened, because we dominated #Measure by sharing the free content delivered at ACCELERATE with anyone who cared to listen, one tip at a time. One UK onlooker even commented that either it was lunchtime or Twitter had crashed when our activity came to an abrupt slowdown during our noshing hour.
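The penetration calculation above can be sketched in a few lines. The tweet texts below are invented examples; a real tracker like TweetReach would of course work from the full tweet stream rather than a hand-built list.

```python
# Invented sample of tweets pulled for the day.
tweets = [
    "Great tip on segmentation! #ACCELERATE #measure",
    "New blog post on attribution #measure",
    "10 tips in 20 minutes is intense #ACCELERATE #measure",
]

# Penetration = share of #Measure tweets that also carry the event hashtag.
measure_tweets = [t for t in tweets if "#measure" in t.lower()]
penetration = sum("#accelerate" in t.lower() for t in measure_tweets) / len(measure_tweets)
print(f"Penetration: {penetration:.1%}")  # → 66.7%
```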
Impact: (measured as the perceived value generated by ACCELERATE) The true impact of this event is best measured by the actions that attendees will take when they arrive back at their desks and apply their newfound insights into their daily work. While this is a real tough one to quantify, measuring impact on these types of things always is. For me and my Partners at Demystified, we gauge our success by the speaker feedback we receive, the generous donations to our Analysis Exchange scholarship fund, and through the comments that we get from individual attendees. By all measures, this was a smashing success.
The skeptics were quick to pounce on the paltry figure, with #WhoopDeeFrigginDo’s and “rounding error” rhetoric (see the Storify.com synopsis). And I agree that half a percentage point, by anyone’s count, isn’t a whole lot of impact, even when it equates to $7 million in a $1.25 billion day of digital shopping. However folks, remember that all online sales last year represented just 7.2% of the holiday cha-chingle in retailers’ pockets. According to comScore’s numbers, that’s $32.6B in digital business over the 2010 holiday shopping season. Yet how many of the total $453B in last year’s holiday sales…or this year’s forecasted $469B…were or will be ***influenced*** by online channels? The answer is: a lot.
According to research firm NPD, 30% of all holiday shoppers plan to buy online this year, with the numbers even larger for high-income households. Further, a full 50% of shoppers will turn to the Internet to research products before buying this year. And that doesn't include another 20% who will rely on consumer reviews and 4% who will turn to social media for their pre-buying intel. As we know, many of these shoppers will hit the stores with smartphones in hand, ready to pull up information or tap into their social networks as necessary.
My point is that if you're so narrowly focused on social media that the only reason you're in it is for the money, then you're missing the point. Social media is today – and will be tomorrow – an enabler. It's a way to engage with people on a meaningful level and to let them engage with one another. As a brand, if you can't see this, you're missing the bigger opportunity. It's not all about the Benjamins. Social media ROI is important, but trying to pin everything down to bottom-line metrics will have you "blue as hell" when it comes time to tally the numbers.
Instead, work to identify other Outcomes for your social media objectives that ***don’t have*** direct financial implications, but that ***do have*** business value. Demonstrating that your social channels reduce call center costs, elevate customer satisfaction, or simply drive awareness of your in-store promotions will deliver value deep within the business.
I’m all for generating ROI from social media activities and making direct revenue correlations when they exist. Yet, in today’s world, social media isn’t just about the bucks. It’s a means to deliver better experiences for the many people who turn to that channel.
If you're interested in learning more about Activating Your Socially Connected Business, download Chapter 3 of Social Media Metrics Secrets, courtesy of IBM.
Google's Eric Schmidt appeared today at LeWeb 2011 and dropped some notable quotes during his interview with conference organizer Loic Le Meur (@loic), including this prescient perspective: "It's reasonable to say that in the future, the majority of cars will be driverless or driving-assisted." Foreshadowing perhaps? Could be…but it's closer to reality than you might think.
Google’s Executive Chairman also quipped, “It’s easier to start a revolution and more difficult to finish it.” Google should know. They’ve been revolutionizing the way in which consumers interact on the Web since their inception and news posted today following the LeWeb chat follows suit.
The news reveals a new initiative launching today called the Social Data Hub. What's even more exciting is the Social Analytics reporting slated to appear in Google Analytics sometime next year. While the details were somewhat vague, I got the inside scoop, and what was published should be enough to incite a minor frenzy in Social Analytics circles.
The "Social Data Hub" is a data platform based on open standards that allows Google to aggregate public social media posts, comments, tags, and a plethora of other activities using the ActivityStreams protocol and PubSubHubbub hooks. (Yeah, that's a real thing…I had to look it up too.) Early partners in the initiative include social platforms such as Digg, Delicious, Reddit, Slashdot, TypePad, Vkontakte, and Gigya, among others. Of course, Google's own social platforms, Google+, Blogger, and Google Groups, are included as well. Noticeably absent from the list are social media moguls like Facebook, Twitter, and LinkedIn, who have yet to buy into the new Googley idea of a Social Data Hub.
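To make the plumbing a little less abstract, here's a hedged sketch of what consuming one of those aggregated feeds might look like. The payload below follows the ActivityStreams 1.0 JSON shape (verb/actor/object), but the activities themselves are invented for illustration, and nothing here is Google's actual hub API:

```python
import json
from collections import Counter

# Illustrative ActivityStreams 1.0 payload. The activities are invented,
# but the verb/actor/object structure follows the ActivityStreams JSON spec.
feed = json.loads("""
{
  "items": [
    {"verb": "post",  "actor": {"displayName": "alice"},
     "object": {"objectType": "note"}},
    {"verb": "share", "actor": {"displayName": "bob"},
     "object": {"objectType": "article"}},
    {"verb": "post",  "actor": {"displayName": "carol"},
     "object": {"objectType": "comment"}}
  ]
}
""")

# A hub aggregating public activity could roll feeds like this up by verb,
# giving a cross-platform count of posts, shares, likes, and so on.
verb_counts = Counter(item["verb"] for item in feed["items"])
print(verb_counts["post"])  # → 2
```

The appeal of open standards is exactly this: once every platform emits the same verb/actor/object structure, aggregation becomes a counting exercise rather than a per-platform integration project.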
If you're scratching your head wondering how this is different from Google just trying to get more of the world's data, you're not alone. At first glance this may seem like yet another big-enterprise ploy to get more data (and oh yeah, Don't be evil). Well, I see this as a huge win for marketers, bloggers, publishers, and anyone else trying to discern the impact of social media marketing across the multitude of channels and platforms available today. Currently, most marketers are forced to evaluate their social media activities through the lens that the platform (or their social monitoring tool) offers. Typically this yields low-hanging counting metrics, which can be of some value but more often than not end up as isolated bits of information that don't provide business value.
Getting at this all-important business value in many cases requires wrangling the metrics into another system, processing the data, and just generally working hard to gain some incremental insight. This is laborious work for the average marketer, so it's no wonder that eConsultancy just reported that 41% of marketers surveyed had no idea what their return on investment was for social media spending in 2011. Yikes!
Google's new Social Data Hub – coupled with Google's Social Analytics reporting – has the potential to knock the socks off these unknowing marketers. By aggregating data from multiple social platforms into the Social Data Hub, they have the ability to make comparisons across platforms to show which channels are driving referrals, which are generating the most interactions, and which are potentially not worth investing in. It's not that big of a stretch to imagine Google linking this information to data within their Google Analytics product such as AdWords, goal completion rates, and the cool new flow visualizations. If/when Google applies the lens of their analytics tool to this new aggregated data set, look out marketers: you just hit the jackpot! Of course, I'm speculating here, but the possibilities are intriguing for a Social Analytics geek like me. That is, of course, if platforms open their APIs to the Social Data Hub. A big if…
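The cross-platform comparison I'm imagining could be as simple as joining per-platform referral counts with on-site conversions. The platform names are real, but every number below is made up purely to show the shape of the analysis:

```python
# Hypothetical per-platform numbers (invented for illustration): referrals
# from each social source and the on-site conversions those visits produced.
platforms = {
    "Google+": {"referrals": 1200, "conversions": 48},
    "Reddit":  {"referrals": 900,  "conversions": 9},
    "TypePad": {"referrals": 300,  "conversions": 15},
}

def conversion_rate(stats):
    """Conversions per referral for one platform."""
    return stats["conversions"] / stats["referrals"]

# Rank platforms by how well their referrals convert, not just raw volume --
# exactly the comparison that siloed per-platform reporting can't give you.
ranked = sorted(platforms, key=lambda p: conversion_rate(platforms[p]),
                reverse=True)
print(ranked[0])  # → TypePad
```

Note how the ranking flips the intuition: the smallest referrer wins on quality. That's the kind of "which channel is worth the investment" answer a unified data set makes possible.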
So Why Would a Platform Buy Into the Social Data Hub?
Well, it's questionable whether Facebook will ever opt into this system, so I wouldn't hold your breath on that one. For other social platforms, however, being part of the hub has some distinct advantages: they get to prove their value by partnering with one of the only solutions on the Web capable of providing real comparative data on the performance of social channels.
This is a no-brainer for fledgling platforms that want to increase their visibility, and even for established players, opting into the Social Data Hub could mean the difference in winning advertising dollars from skeptical marketers. While the big dogs in social media may take a while to come around, I see this new Hub as a potential great equalizer for understanding the impact of social media on referrals to on-site activities, which can ultimately lead to conversions and bottom-line impact.
While today’s announcement may be just a small ripple in the social media pond, I see big waves building for Marketers. But that’s just my take on the disruptive and revolutionary force that is Google…
Those of you who follow my blog, my work here at Web Analytics Demystified, and my active participation in the WAA are well aware of my interest in and commitment to consumer data privacy in web analytics. In addition to my role as liaison to the Standards committee within the Web Analytics Association and my participation in numerous initiatives related to protecting personal privacy, I am also the co-author of the Web Analysts Code of Ethics.
These efforts have been primarily focused on the web analytics community in an attempt to elevate our own knowledge of how our work and actions are perceived by the outside world. And, thanks to a long-and-growing list of Code of Ethics signers, I believe that we are starting to have an impact.
Despite my participation and involvement in WAA Standards and Code of Ethics work, there has been one nagging and persistent concern: actionability. Given the right intentions, what action can business owners take to clearly express their commitment to digital consumer data privacy? There really hasn’t been anything …
Today, I’m pleased to announce on behalf of myself, Adam and Eric – along with our partners at BPA Worldwide – that we are very proud to introduce our Web Analytics Demystified GUARDS program. This is the world’s first audit and certification for digital marketing and measurement platforms.
The GUARDS audit is a comprehensive and actionable assessment of how consumer data flows into – and through – the enterprise via a wide range of online platforms and solutions. Designed primarily for the C-suite and corporate shareholders — the group at the greatest risk from privacy-related litigation but also least likely to have day-to-day knowledge of the who, what, where, when, why, and how of online data collection — the GUARDS audit provides a clear and actionable plan to ensure the highest level of protection for consumer-collected, personally identifiable data.
Those clients who demonstrate an appropriate balance of data collection and data governance will be eligible for GUARDS certification. This certification is Web Analytics Demystified and BPA Worldwide’s assurance that consumer data is being used in a respectful, appropriate, and secure manner within the business. We have built this program for companies who want to get out ahead of proposed legislation and be proactive in their commitment to the digital consumer.
We are incredibly excited to be partnering with BPA Worldwide on this effort and are happy to answer any questions about GUARDS that you may have. Please contact us directly to discuss the effort or to arrange a GUARDS audit for your business.
Earlier this month, I gave a presentation at the Columbus Web Group meet-up that I titled Mythbusters: Analytics Edition. The more I worked on the presentation — beating the same drums and mounting the same soapboxes I’ve mounted for years — the more I realized that the Discovery Channel show is actually a pretty useful analog for effective digital analytics.
Based on the very successful roll-out of our Advanced Analytics Education offering at ACCELERATE 2013, Web Analytics Demystified is delighted to announce our "Adobe Intensive" sessions in Portland, Oregon, April 23rd and 24th, 2014. We will be packing decades of knowledge into two days of Adobe-centric training covering SiteCatalyst, ReportBuilder, Discover, and Test&Target, all for one low price.
Over the past year or so, I've had the opportunity to see some "do's" and "don'ts" when implementing a tag management system. I thought today I'd share some thoughts on the most important item on the "do" list: every good TMS implementation I've seen is supported by a carefully planned, well-documented data layer.
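What does "well-documented" mean in practice? A data layer is really a contract: a structured object whose keys and types are agreed on before any tags are written. As a rough illustration (in Python, standing in for the page-level JavaScript object a TMS would actually read), here's the idea; the key names are loosely modeled on the W3C CEDDL "digitalData" naming, and the specific contract is invented for this sketch:

```python
# A documented data layer is a contract: required keys and their types,
# agreed on before tags are written. This sketch uses Python in place of
# the page-level JavaScript a TMS would read; key names loosely follow
# the W3C CEDDL "digitalData" convention, but the contract is invented.
REQUIRED = {
    ("page", "pageInfo", "pageName"): str,
    ("page", "category", "primaryCategory"): str,
    ("user", "profile", "loggedIn"): bool,
}

def validate(data_layer):
    """Return the list of contract violations (empty means it passes)."""
    problems = []
    for path, expected in REQUIRED.items():
        node = data_layer
        for key in path:
            node = node.get(key, {}) if isinstance(node, dict) else {}
        if not isinstance(node, expected):
            problems.append(".".join(path))
    return problems

digitalData = {
    "page": {"pageInfo": {"pageName": "home"},
             "category": {"primaryCategory": "landing"}},
    "user": {"profile": {"loggedIn": False}},
}
print(validate(digitalData))  # → []
```

The validation step is the part that matters: a data layer that isn't checked against its documentation drifts the moment the second developer touches the page.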
A lot has been written about “big data” in the past two or three years — some say too much — and it is clear that the idea has taken hold in the corner offices and boardrooms of corporate America. Unfortunately, in far too many cases, “big data” projects are failing to meet expectations due to the sheer complexity of the challenge, lack of over-arching strategy, and a failure to “start small” and expand based on demonstrated results.