Archive for the ‘google analytics’ tag
By Ali Behnam | October 24th, 2011 at 7:01 am | 1 Comment
One of the many methods through which Tealium improves site performance is asynchronous loading of tags. This method is becoming increasingly popular, especially after Google's adoption of asynchronous tags for its analytics product in 2010. Tealium is one of the pioneers in asynchronous tracking, first adopting the methodology for tag management back in 2008.
But what exactly is asynchronous tracking and how does it improve site performance?
Before explaining the asynchronous method, it is important to note how tags load and why they slow down web sites. Most tags today are loaded in a synchronous manner (think serial). When a page loads a synchronous tag, it waits for the tag content to load before moving on to the next piece of content. The figure below shows an example of a page loading 4 tags in a synchronous, or serial, manner. The page starts by loading the first tag. After the tag has completely loaded, the page moves on to the second tag. The process is then repeated for the ensuing tags. Assuming each tag takes half a second to load, we're looking at a total of 2 seconds to load all 4 tags.
With asynchronous tracking, the browser can load the different tags in parallel. It no longer has to wait for a given tag to load completely before moving on to the next tag or the rest of the page content. This is shown in the figure below. Using our example of the page with 4 tags, we can see that the browser starts downloading the 4 tags in parallel, completing the process much faster. The 4 tags in this case are completed in a fraction of the time taken by the synchronous method.
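The parallel loading described above is typically implemented by injecting script elements dynamically. Here is a minimal sketch; the tag URLs in the usage comment are placeholders, not real vendor endpoints:

```javascript
// Minimal sketch of asynchronous tag loading. The URL passed in is a
// placeholder in the usage comment below, not a real vendor endpoint.
function loadTagAsync(url) {
  var script = document.createElement('script');
  script.type = 'text/javascript';
  script.async = true; // hint: download without blocking HTML parsing
  script.src = url;
  // Insert before the first script element so this works even before
  // the rest of the page has been parsed.
  var first = document.getElementsByTagName('script')[0];
  first.parentNode.insertBefore(script, first);
  return script;
}

// In the page, all four tags would start downloading in parallel:
//   loadTagAsync('//tags.example.com/analytics.js');
//   loadTagAsync('//tags.example.com/ads.js');
//   loadTagAsync('//tags.example.com/affiliate.js');
//   loadTagAsync('//tags.example.com/survey.js');
```

Because each call returns immediately after inserting the script element, the browser is free to fetch all four resources at once.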
Asynchronous tracking has many benefits. One obvious benefit is improved site performance, as demonstrated above. Another benefit is improved tracking and data accuracy. Because tags are loading in parallel to the rest of the content, they can be placed on top of the page, which improves the accuracy of data being collected.
By Ali Behnam | July 17th, 2011 at 2:21 pm | 0 Comments
One of the conversations that digital marketing teams are having in today's environment is whether or not they need a tag management system. As companies increase their use of digital marketing technologies, their need for a tag management solution increases. Up to now, most companies seeking a tag management system typically shared one of two characteristics:
- They deployed lots of different tags (i.e. analytics, ads, affiliates, etc.)
- They had a constant need to change tags (i.e. new marketing programs, trial of new vendors, etc.)
There are of course other factors that can prompt organizations to invest in tag management systems. For a thorough list, we recommend that you check out the Forrester research titled “How Tag Management Improves Web Intelligence” by Joe Stanhope.
Up to now, the group least likely to require a tag management system shared the following two characteristics:
- They don’t use lots of tags
- They have no plans to change their vendors
Sounds logical right? Think again.
Just recently, Google introduced a new feature in Google Analytics which provides valuable reports around page load performance, called Site Speed Reports. The new feature helps companies determine the impact of site performance on their web site conversion.
In order to take advantage of this feature, customers will have to update their tracking code in the following manner:
var _gaq = _gaq || [];
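For reference, the line above belongs to Google's asynchronous snippet, which looks roughly like this (the `UA-XXXXX-X` account ID is a placeholder, and the loader portion is guarded so the sketch also runs outside a browser):

```javascript
// Google Analytics asynchronous snippet. The account ID is a placeholder.
var _gaq = _gaq || []; // command queue; safe to push to before ga.js arrives
_gaq.push(['_setAccount', 'UA-XXXXX-X']);
_gaq.push(['_trackPageview']);

// Loader: injects ga.js without blocking page rendering (browser-only).
if (typeof document !== 'undefined') {
  (function () {
    var ga = document.createElement('script');
    ga.type = 'text/javascript';
    ga.async = true;
    ga.src = ('https:' == document.location.protocol ? 'https://ssl' : 'http://www')
             + '.google-analytics.com/ga.js';
    var s = document.getElementsByTagName('script')[0];
    s.parentNode.insertBefore(ga, s);
  })();
}
```

The `_gaq` array queues commands until ga.js loads, at which point the library replaces it and replays the queued calls.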
The change of tracking code is not unique to Google Analytics. Just recently, Yahoo! announced that the Yahoo! Web Analytics tracking code V4 will no longer be supported.
In both cases, tag management systems let customers update their tracking code without changing their pages. For Tealium customers, this is just a matter of changing their templates from within the Tag Management Console.
We now see tag management not only as a technology deployed by customers that want to constantly add or change their vendors, but also by organizations that want to make sure they're deploying the latest version of their vendor tags. To find out more about how Tealium can help you leverage the latest features from your digital marketing vendors, contact us.
By Ali Behnam | September 16th, 2010 at 8:00 am | 0 Comments
Tag Management has arrived. While attending this year's Google Analytics Certified Partner (GACP) Summit, one of the topics of discussion among attendees was universal tagging and tag management. This of course is great news, as it validates our vision dating back to 2008, when we first introduced universal tagging for web analytics.
What was once a vision and a consulting project here at Tealium is now a topic of discussion at web analytics conferences, including the upcoming X Change. And as marketers’ knowledge of universal tagging increases, so does the product’s maturity. A key requirement has been the ability for business users to manage their tags without IT involvement.
Enter Tag Management Console.
The Tag Management Console is the admin console for Universal Tag deployments. Within the Console, non-technical users can manage their tags using a drag-and-drop interface.
Want to add DoubleClick tags to your site? Simply drag the DoubleClick icon and enter your account ID.
Want to add Google Analytics? Drag the Google Analytics icon and enter your account ID.
Want to add Eloqua tags for better lead nurturing? Well, you get the idea.
Once configured, users can add, edit and delete their marketing tags and manage complex implementations such as changing mappings from one vendor variable to another. Again, all this is done without straining valuable IT resources, meaning that you can make changes to your tags in hours or days instead of weeks and months.
Contact us to schedule a live demo of Tag Management Console.
By Ali Behnam | May 24th, 2010 at 9:00 am | 1 Comment
In our last post, we talked about the use of Universal Tag to improve your web analytics implementation. In this post, we are going to discuss another major benefit associated with the Universal Tag.
Universal Tag is about having a better web analytics process.
Web analytics is an iterative process. A typical web analytics cycle is shown here. First, users deploy their web analytics tool. From there, they analyze the data and make changes to their sites based on findings. The cycle then repeats itself. However, in some cases, the findings may require users to look at the data from a different angle. Often, the new angle will require a change in the web analytics implementation, which means re-tagging the site.
To demonstrate this, we're going to discuss an analysis that we recently did for a technology company. This client sells expensive enterprise software and uses a large number of white papers to educate its user base. As part of the analysis, the client wanted to know if white papers have a positive impact on site conversion, which in this case is lead generation. The client's tool of choice is Google Analytics.
To do the analysis, we used the “Visits with Conversion” segment and looked at the downloaded files for the segment. This shows us which files were downloaded during the same session in which the lead was captured. The results were initially shocking. For this particular segment, we saw about 30% fewer white paper downloads than in an average session. Are we to believe that converting visitors are less interested in white papers than non-converting ones? This meant that we needed additional information.
The next hypothesis was that visitors download the papers, read them and then come back to the web site and submit their information. In order to prove this new hypothesis, we had to make an implementation change since Google Analytics does not provide this level of cross-session analysis without customization.
The solution was to use a visitor-scope custom variable to capture the downloaded document and look at the “download” custom variable report for the “converting” visitors.
With default web analytics deployments, this requires editing the tagging within the download pages, which is a laborious process involving the web development team. However, through the Universal Tag, this process can be implemented without a single page tag change.
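In Google Analytics terms, the change amounts to something like the following on each download link. The slot number, variable name and event names are illustrative choices, not taken from the client's actual implementation:

```javascript
// Record a white paper download in a visitor-scope custom variable
// (scope 1 = visitor), so it survives across sessions, and also fire
// an event. Slot and names below are illustrative, not the client's.
function trackWhitepaperDownload(docName) {
  pageTracker._setCustomVar(1, 'downloaded_doc', docName, 1); // scope 1 = visitor
  pageTracker._trackEvent('whitepaper', 'download', docName);
}
```

With the visitor scope, the “downloaded_doc” value is still attached to the visitor when they return days later to submit the lead form, which is what makes the cross-session analysis possible.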
Following this change, the data proved our hypothesis. In fact, we learned that it takes an average of 2 days between a white paper download and a lead registration. This exercise clearly showed the dangers of relying only on session-level data when dealing with complex sales.
Universal Tag made this discovery possible without re-tagging. Because organizations can fine-tune their implementation without costly re-tagging exercises, they can learn faster and therefore get more value from their web analytics investment than those using standard tags.
By Ali Behnam | May 16th, 2010 at 6:42 pm | 2 Comments
It has been a while since our last blog post. We’ve been busier than ever deploying the Universal Tag on client sites and I’m happy to say that the number of unique client deployments is approaching 50.
There's more and more buzz around Universal Tagging every day, and as a result, we're getting the same question from more people than ever: What is the Universal Tag?
In this multi-part series, we’re going to share our thoughts as to what the Universal Tag is and what it’s not.
In this post, we’re going to cover what we see as the first misconception about the Universal Tag.
Misconception: Universal Tag should be used by those that are in the process of switching vendors.
Fact: Based on two years of experience, we can categorically say that this is not the primary benefit that Universal Tag provides to clients.
Universal Tag is about fixing your current implementation.
That’s right. Out of almost 50 deployments of Universal Tag, only two clients have signed up in order to switch vendors. The large majority has no plan to switch vendors. Yet they recognize that their implementation can be vastly improved. Universal Tag provides them a platform to do just that.
The reason is that the Universal Tag provides a simplified platform for tagging compared to traditional vendor tags. It also changes the best practice implementation considerably. Today's web analytics tools require you to think well in advance about the different types of reports that you want to get from the solution. Only after you have a good understanding of your reporting needs can you start the tagging process.
The Universal Tag framework changes this by letting you send generic data and map it to vendor-specific syntax at any time. This changes best practice implementations in that it lets you just send the data. The rest can be handled through the Universal Tag.
Here is a real-life example that helps demonstrate the point.
Example: Product Syntax
Consider a scenario of a product page. Within the page, you’d like to capture several components, including the product name, product size, color, and number of ratings received. As far as reporting is concerned, you’d like to get reports on top products viewed, top colors and sizes viewed, as well as average ratings of products (a numerical report) and a histogram or bar chart report of reviews (how many views for products with rating 1, rating 2, …).
Sounds simple, huh? Let's look at how you would go about implementing this with the two most popular tools on the market: Google Analytics and SiteCatalyst. The examples we'll use are for a cotton shirt, color: white, size: large, and a rating of 4.5.
First, Google Analytics. We're going to use custom variables to capture product name, size and color. The challenge here is that not only do your developers have to know the specific syntax, they also have to be aware that custom variables can be visitor-, visit- or pageview-scoped. There's no such thing as a numerical custom variable in Google Analytics, so you'll have to use the event tracking feature to get your numerical ratings report. The implementation syntax will look something like this:
pageTracker._setCustomVar(1,"product view","cotton shirt",3);
pageTracker._setCustomVar(2,"color","white",3);
pageTracker._setCustomVar(3,"size","large",3);
pageTracker._setCustomVar(4,"rating","4.5",3);
pageTracker._trackPageview();
...
pageTracker._trackEvent("product view","cotton shirt","rating","4.5");
Let's try the same thing with SiteCatalyst. We're going to assume that we'll use prop1 and eVar1 for size, prop2 and eVar2 for color, prop3 and eVar3 for rating, and event1 as the numerical event used to measure the average rating. The implementation syntax will look something like this:
s.events="event1,prodView";
s.products=";cotton shirt;;;event1=4.5;evar1=large|evar2=white|evar3=4.5";
s.prop1="large";
s.prop2="white";
s.prop3="4.5";
Again, you're requiring your development team to know what the different props, eVars and events are, as well as the exact syntax that should be used (for example, using lowercase evar for merchandising).
Now let’s look at what this same implementation will look like with the Universal Tag. Here’s an example syntax:
yourdata.product="cotton shirt";
yourdata.size="large";
yourdata.color="white";
yourdata.rating="4.5";
yourdata.page_type="product view";
Now what if you wanted to deploy both SiteCatalyst and Google Analytics? No changes. The implementation would be exactly the same.
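A sketch of how a tag management layer might translate the generic object into both vendors' syntax. The function names and mappings below are illustrative, not Tealium's actual API:

```javascript
// Illustrative sketch only: maps a generic data object to vendor-specific
// output. Function names and mappings are hypothetical, not Tealium's API.
var yourdata = {
  product:   "cotton shirt",
  size:      "large",
  color:     "white",
  rating:    "4.5",
  page_type: "product view"
};

function toGoogleAnalytics(d) {
  // Returns the _setCustomVar argument lists the tag would issue.
  return [
    [1, "product view", d.product, 3],
    [2, "color",        d.color,   3],
    [3, "size",         d.size,    3],
    [4, "rating",       d.rating,  3]
  ];
}

function toSiteCatalyst(d) {
  // Returns the s-object fields the tag would set.
  return {
    events:   "event1,prodView",
    products: ";" + d.product + ";;;event1=" + d.rating +
              ";evar1=" + d.size + "|evar2=" + d.color + "|evar3=" + d.rating,
    prop1: d.size, prop2: d.color, prop3: d.rating
  };
}
```

The page only ever populates `yourdata`; adding or swapping a vendor is a matter of adding or changing a mapping function, not re-tagging pages.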
This simplified implementation has several benefits. The one most clearly addressed in this post is that it simplifies implementations and vastly reduces the deployment cycle. Your development team no longer has to master the analytics tool being used and can concentrate on sending the data through the simplified tag. Your business team or analytics department can then translate this data into vendor-specific syntax.
In future posts, we will share some of the other benefits that we’re seeing with the Universal Tag. Stay tuned.
By Ali Behnam | March 5th, 2010 at 5:29 pm | 1 Comment
Last week at Online Marketing Summit (OMS) I had the pleasure of sitting on a panel of web analytics professionals, along with Eric Peterson, Matt Belkin from Omniture, Amanda Kahlow, Bill Bruno and Enrique Gonzalez from AARP.
First, I need to congratulate the folks from OMS for putting together a great show. There was a record attendance of over 800 professionals covering all areas of online marketing, along with a great lineup of presenters.
During the panel discussions, one of the questions asked was how businesses should deal with multi-touch attribution.
Here’s a sample scenario to help explain the pain point involved:
A visitor is interested in running shoes and conducts a Google search for the term “running shoes”. The visitor is presented a number of search ads from competing vendors such as Nike, Adidas, and others, and decides to check out Nike and Adidas sites. The visitor gets intrigued by the Nike ID line of products and decides to conduct some further research, and even registers for the Nike newsletter. While doing research on third-party sites, the visitor sees a banner ad for the Nike ID site and clicks the banner. Finally a day later the visitor gets an attractive email offer from Nike and ends up buying the shoes.
In this scenario, the visitor has been exposed to three separate campaigns. The “running shoes” search campaign generated the awareness. The banner campaign possibly helped increase awareness and instill further trust in the product, and finally, the newsletter sealed the deal. By default, web analytics providers give credit to the last campaign touched by the user. In our example, the newsletter campaign will get the credit, whereas if it weren't for the search campaign, the visitor would not have even been aware of the Nike ID line. In fact, two variables make multi-touch attribution a real challenge:
- Number of simultaneous campaigns: if you're a company running a large number of campaigns in parallel, you should account for multi-touch attribution.
- Complex or expensive product: the more complex the product, the longer the consideration and therefore the more likely you are to have multiple touch points.
So how does one tackle this challenge? First, for the large companies running many campaigns, there are a number of commercial solutions such as ClearSaleing that help solve this challenge (and a lot more). But what about smaller companies with small budgets using free solutions such as Google Analytics or Yahoo! Web Analytics?
First, we recommend that you investigate whether you even have a multi-touch attribution problem. How? Let's take another look at our example scenario. Two metrics within your analytics solution can give insight into this: time to purchase, and number of visits prior to purchase (or conversion).
For example, if you use Google Analytics and have e-commerce tracking, you can use the “Visits to Purchase” report to see how many times visitors come to your site prior to purchasing. If you are a lead generation type web site and have your conversions set up as goals, you can use the default “Visits with Conversions” segment and look at the loyalty report for the segment. In both cases, if most of your conversions come from first-time visitors, then multi-touch attribution is not going to be a problem for you and the rest won't apply.
However, if you happen to see a big difference between converting visitors and others, then you can build a quick attribution report by following these steps:
- Push the cookie value into a custom variable – such as a visitor-scope custom variable in Google Analytics, a session-based custom field in Yahoo! Web Analytics or an eVar in SiteCatalyst.
You now have a simple yet powerful solution for seeing not only which campaigns your visitors are responding to, but also in what order.
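The cookie-accumulation step can be sketched as follows. The delimiter and de-duplication logic are illustrative assumptions, not a standard:

```javascript
// Illustrative sketch: accumulate each campaign touch into an ordered
// path string (to be stored in a first-party cookie, then pushed into a
// visitor-scope custom variable). Delimiter and de-dup are assumptions.
function appendCampaign(existing, campaign) {
  // Keep campaigns in the order they were touched; skip an immediate repeat.
  var touches = existing ? existing.split('>') : [];
  if (touches[touches.length - 1] !== campaign) {
    touches.push(campaign);
  }
  return touches.join('>');
}

// e.g. 'search' -> 'search>banner' -> 'search>banner>email'
```

The resulting path (e.g. `search>banner>email`) could then be set with a real visitor-scope call such as `pageTracker._setCustomVar(5, 'campaign_path', path, 1)`, where the slot and name are again illustrative.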
By tealium | January 27th, 2010 at 9:00 am | 0 Comments
We are pleased to announce the availability of version 2 of Tealium Universal Tag. The new version provides many enhancements following several enterprise-level web analytics deployments across a large number of platforms, including SiteCatalyst, Omniture Insight, Google Analytics, Yahoo! Web Analytics, Unica NetInsight, Webtrends and Coremetrics, as well as a number of digital marketing solutions such as DoubleClick, Atlas, ForeSee Results and more.
Some of the new functionality includes:
- Improved multi-vendor support: the new version provides a superior method for complex implementations with multiple vendors. For example, non-technical users can map page tag values differently into various web analytics solutions, while also mapping them to their PPC bid management tool.
- Attribution management: designed specifically for clients using multiple affiliates, version 2 of Tealium Universal Tag has the ability to conditionally send data only to the winning affiliate(s).
- Multi-currency support: the new version of Universal Tag supports transactions in multiple currencies for digital marketing vendors that do not provide such support by conducting on-the-fly conversions to the supported currency.
- Universal data capture: this feature allows non-technical users to automatically capture data elements from the page and map them to their web analytics and digital marketing solutions. Examples of such data elements include microformats, meta tags, in-page style elements, query parameters, cookie values, etc.
We’ll be publishing a number of case studies on Universal Tag deployments soon. In the meantime, to see Universal Tag in action, please contact us.
By Ali Behnam | January 19th, 2010 at 10:00 am | 0 Comments
One way to make web analytics actionable is to break the site into different sections (such as home pages, category pages, etc.) and generate reports specific to those pages/sections. In this post, we’re going to identify some of the most common reports for analyzing home pages.
First, let's start by defining home pages and their goals. The home page is typically the main gateway page for your site. It's the first impression that your visitors will have of your site. Its role is to showcase your offerings and value proposition and to provide quick access to the most popular or important sections of your site. For this reason, web analytics should help you answer some of the following questions:
- How effective is the home page at directing visitors to product pages?
- Which part of the home page is the most effective?
- Is the home page effective at enticing visitors to learn more?
Based on these, below are some popular web analytics reports for home page analysis, along with explanations:
- Bounce Rate
- Micro Step Conversion Rate
- Conversion Rate
- Acquisition Sources
- Home Page Real Estate
Bounce Rate
The bounce rate is defined as the number of bounces (single-page visits) divided by entries. It shows what percentage of the traffic landing on the page bounces without viewing any other page on the site. It is a reflection of the home page's ability to retain visitors. Clearly, the goal is to make changes to the home page that lower the bounce rate. It's probably one of the best reports to look at when analyzing home pages. This report is widely available in most web analytics tools, such as Google Analytics, Yahoo! Web Analytics and Unica NetInsight.
Micro Step Conversion Rate
Although the ultimate goal of your site is to drive conversions, we recommend micro step conversions as a better way to assess home pages. The goal of your home page is to drive people to your product description pages. It's at that level that you do the selling. For this reason, assessing the success of your home page should center on its ability to get visitors to those ensuing pages. You can get this in a number of ways. Inside tools such as Yahoo! Web Analytics and SiteCatalyst, you can tag your product description pages as events and measure the success of your home page against this event. In Google Analytics, you can create a goal for your product pages, as long as the pages have a consistent nomenclature. If not, you can create an advanced segment for your product pages and look at the home page traffic for the segment. Such metrics can also be created fairly easily inside Unica NetInsight and Webtrends.
Conversion Rate
Yes, conversion rate should not be your primary report for home page analysis, but you can still use it as a tie-breaker. For example, if two versions of the home page have similar bounce and micro step conversion rates, you can use the overall conversion rate to see if one version does in fact do a better job. Unfortunately, we often see people use conversion rate as the primary report for assessing home page effectiveness.
Acquisition Sources
Want to lower your bounce rate? One place to start is the acquisition sources. You can start with the sources of traffic to your home page and look at their respective bounce rates. Start with referring sources with high bounce rates. Often, you'll find a messaging gap between the referring sites and your home page. The referring site may be saying one thing while your home page is promoting something else. While you cannot optimize your home page for all referring sites, you can start with those with high traffic and high bounce rates and provide messaging on your home page that helps retain this incoming traffic. You'll typically find that a handful of sites accounts for a high percentage of your bouncing traffic.
Home Page Real Estate
To understand real estate effectiveness, you'll have to look at the click activity on the page. Rather than looking at all page links, we recommend classifying the links into sections or categories (such as header, footer, navigation, left box, right box, etc.) and analyzing the activity by section. This is different from the default site overlay that you typically get from web analytics tools and requires some additional configuration to get proper reporting. For example, if you're using Google Analytics, we recommend using Event Tracking to track the activity on various sections and on links within sections. You can then see how effectively each section and each link gets visitors to product pages and to final conversion.
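One way to implement the section-level tracking described above with Google Analytics Event Tracking. The section and link names are examples, not a prescribed taxonomy:

```javascript
// Report each home-page click as an event, with the page area as the
// category and the specific link as the action. Section names
// ('left box', 'header', ...) are examples only.
function trackSectionClick(section, linkName) {
  pageTracker._trackEvent('home: ' + section, linkName);
}

// e.g. <a href="/products"
//         onclick="trackSectionClick('left box', 'Spring Sale')">
```

Grouping clicks this way lets the Event reports show activity per section rather than per individual link, which is what makes the real-estate comparison possible.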
You can also investigate some of the in-page analytics tools such as CrazyEgg and ClickTale, which do a more thorough job of providing such reports than web analytics tools.
Of course, depending on your business, your reporting needs may vary, but we believe this list provides a good starting point for optimizing one of your most important pages.
By Ali Behnam | January 5th, 2010 at 10:00 am | 0 Comments
If you’re a media site, one of the most critical measurement objectives is to assess the success of your content. But how does one go about measuring this? Default web analytics reports often fall short in this area. Let’s take a look at some of the most popular content metrics provided by the analytics solutions.
Page Views
This is probably the best out-of-the-box metric for measuring the success of content: the more page views, the more popular the content. However, relying on this number alone has two potential shortcomings. First, it fails to differentiate between traffic segments. For example, a loyal visitor is more valuable to a content site than someone who visited the site for the first time and will likely never come back. Also, page views alone fail to reflect the level of engagement on the page. For example, visitors could be clicking an article and spending only a few seconds on it. The quality of traffic should therefore be accounted for.
Time spent on page
This metric clearly adds a new dimension around engagement. The more time visitors spend on the content page, the more engaged they are. However, you cannot rely on this metric alone. One key reason is that this metric is not available for all visitors. For example, if the content page was the only or the last page viewed during the session, then this metric is simply not calculated within popular web analytics solutions (we'll discuss this in a separate post).
Another shortcoming of this metric is that, like page views, it fails to segment reports by the quality of the visitor (first-time vs. loyal).
And finally, the Time Spent metric alone does not take into account the popularity of the content. For example, an article could be very engaging but only be viewed by a handful of people.
Visitor Loyalty
Web analytics solutions often provide this in the context of overall site traffic, and you may have to tweak your reports to get it, but it's important to note what percentage of your content is consumed by first-time visitors and what percentage by loyal visitors – visitors that come back to the site. This is important because, in the long run, you may want to build a loyal following and create content that's tailored to them.
Bounce Rate
This is also one of the most popular metrics within analytics solutions, but media sites should be careful not to over-analyze their bounce rates. As an example, consider a media site with an RSS feed. Through the RSS feed, visitors can see the headlines of new content using their favorite RSS aggregator. If an article looks appealing, they click the link, enter the site, read the content and then leave. That's a bouncing visit, but still highly qualified traffic, because the visitor has subscribed to the RSS feed. The visitor loyalty metric indirectly takes care of this shortcoming.
Content Engagement Score
In this post, we’d like to introduce you to a content scoring KPI that we’ve used to help some of our media clients put a monetary value next to their content.
The formula is as follows:
Engagement Score = (Page Views × Avg. Time Spent × Avg. Loyalty)
- “Page Views” is the number of times the page was viewed during the reporting time period.
- “Avg. Time Spent” is the average number of seconds or minutes spent on the page by visitors.
- “Avg. Loyalty” is the average number of visits to the site by your visitors (1 for first time visitors, 2 for those who’ve been to the site twice, and so on).
Of the three metrics needed to create this KPI, “Avg. Loyalty” is the most difficult to get, but it can be obtained using estimates in popular tools. For example, with Google Analytics, you can use the %New Visits metric to estimate the average loyalty. You can use the following formula for this purpose:
Avg. Loyalty = (%New Visits) + 2 * (1 - %New Visits)
What this formula does is assign a score of 1 to each new visitor and a score of 2 to all others, providing a reasonable approximation. You can create a similar model with Yahoo! Web Analytics – see the figure below for an example of such a report in Google Analytics.
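The score and the loyalty approximation above can be computed directly; here is a sketch, with the sample numbers made up purely for illustration:

```javascript
// Content Engagement Score = Page Views x Avg. Time Spent x Avg. Loyalty,
// with Avg. Loyalty approximated from %New Visits as described in the post.
function avgLoyalty(pctNewVisits) {
  // 1 point per new visitor, 2 points for everyone else.
  return pctNewVisits + 2 * (1 - pctNewVisits);
}

function engagementScore(pageViews, avgTimeSpentSec, pctNewVisits) {
  return pageViews * avgTimeSpentSec * avgLoyalty(pctNewVisits);
}

// Illustrative numbers: 1000 views, 90s average time, 60% new visits.
//   avgLoyalty(0.6) = 0.6 + 2 * 0.4 = 1.4
//   engagementScore(1000, 90, 0.6) = 1000 * 90 * 1.4 = 126000
```

A page with all returning visitors (`pctNewVisits = 0`) gets its views-times-time doubled, while a page seen only by first-timers is left unweighted, which matches the scoring intent described above.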
Using this model, pages with the highest traffic, time spent and the most loyal visitors will get the highest scores, which is the desired outcome. You can of course use any analysis tool to create your score. One popular tool is Microsoft Excel, where the score can easily be created and analyzed. See figure below for an Excel example. It shows that our posting for tracking internal campaigns is the most engaging even though it’s an old blog post.
Overall, this model provides a simple KPI for measuring site content, while taking into account the popularity, engagement and the quality of the visitor. It does however have its shortcomings. The primary shortcoming is that it is dependent on cookies. For loyalty to be counted, visitors have to accept cookies. Furthermore as visitors delete cookies, it will impact this KPI. However, it’s fair to assume that visitor cookie deletion is not dependent on their content preference, so you should expect the same rate of deletion across the board.
The metric also depends on time spent reporting, which is not available for all visitors. Having said that, it's fair to assume that the time spent by those who view a certain piece of content as their last page should be in line with those who view the content in the middle of a session. After all, the purpose of this model is to provide an approximate score for content engagement and popularity.
You may also be in a situation where loyal visitors are no more valuable than first-time visitors; newer web sites, for example, fall into this category. In that case, you can simply omit the “Avg. Loyalty” metric from the formula (or replace it with the value 1).
So there you have it. We welcome your feedback on the model and hope you find it of use.
By Ali Behnam | October 20th, 2009 at 11:13 am | 2 Comments
It has by now become a tradition for Google to announce new features of its analytics solution during the popular eMetrics events. The company announced its entry into enterprise web analytics during last year's eMetrics show in DC, when it launched Advanced Segments and its API. This year's eMetrics show in DC was no different: Google announced some of its most exciting features yet. Here's a summary of the exciting new features that you can now find in Google Analytics.
Analytics Intelligence
Ever wondered how to navigate through the mountains of data and make sense of them? What do changes in trends mean to your business? Should you be concerned about them or not? What if the analytics solution automatically gave you clues about important changes to your site based on past performance and statistical models? Enter Analytics Intelligence. This exciting new functionality automatically alerts you of important site changes based on 11 dimensions and 18 metrics. With this feature, making sense of trend changes becomes easier than ever before. Spend less time analyzing data and more time improving your web site.
Custom Alerts
Although Analytics Intelligence is a great start, web analytics practitioners still know their business better than Google Analytics ever will. The new version of Google Analytics also lets you set customizable alerts based on events that are important to your specific needs. For example, you can create alerts if your social media traffic varies more than usual.
Custom Variables
In our opinion this is the most exciting new feature in Google Analytics. The new version of Google Analytics now lets customers send custom data points to Google to be analyzed as extra dimensions within their analytics account. Want to track additional data per page such as author, category, topic, genre, etc.? You can with Custom Variables. The new version lets you pass up to 5 simultaneous custom variables, with full control of their scope, including whether the variables are set at the page, session or visitor level.
Extended & Threshold Goals
One of the primary reasons for using multiple profiles was to work around the limit of 4 goals per profile. The new version of Google Analytics now supports up to 20 goals, with the ability to classify them in goal sets. Additionally, you can now create threshold goals: goals based on engagement thresholds such as the amount of time spent on site or the number of page views per session. This is a particularly welcome addition for media sites that need to use engagement thresholds as goals.
Advanced Table Filtering
This feature gives you more control over how you filter your data. Examples include the ability to filter data based on multiple dimensions or metric thresholds, such as bounce rate figures.
Expanded Mobile Tracking
As more people use their mobile phones to browse the internet, there's a growing need to track mobile usage of web sites. This feature is welcome news for mobile marketers and site managers who need to better understand their visitors' experience.
Unique Visitor Metrics
These new metrics provide a more comprehensive view of the dimensions reported in Google Analytics such as referring sources. Want to know how many unique visitors your various campaigns and marketing programs are generating? This is the answer.
Congratulations to the team at Google for coming up with another impressive release. This release further solidifies Google’s place as an enterprise web analytics solution.