Analysis Exchange

Free Web Analytics for Nonprofits

Posts tagged nonprofit

1 note

Thanks Pro Bono Net for Sharing Your Analysis Exchange Experience

Our national non-profit, Pro Bono Net, recently participated in the Analysis Exchange program. We work in close partnership with nonprofit legal organizations across the United States and Canada to increase access to justice for the millions of low-income individuals who face legal problems every year without help from a lawyer. We do this in part by supporting innovative and effective use of technology in the nonprofit legal sector through a variety of programs. LawHelp.org, a project of one of these programs, is an online resource that helps low- and moderate-income people find free legal aid programs in their communities, answers to questions about their legal rights, court information, links to social service agencies, and more.

As an organization that supports an online resource, we recognized the value of understanding analytics and data, but we had questions about how to harness that information. When we discovered the Analysis Exchange program, it became apparent that it could help us achieve our goal. Overall, our experience with the program was positive, and we’d like to share what we learned.

Our Project
For our project, we decided to focus on usability, specifically how users navigate our LawHelp.org and Spanish LawHelp.org homepages. We hoped to get a better understanding of how Google Analytics could help us see this navigation and ultimately identify where site usability improvements were needed. Articulating the project goal took about an hour of work.

Shortly after we submitted our project, a handful of possible mentors and mentees expressed interest. We were really impressed with the caliber of the people who asked to work with us, and we ended up partnering with Aaron (our project mentor) and Rosa (our project mentee).

We then moved quickly to arrange a time to meet virtually and introduce ourselves. For this meeting, we created a structured agenda that included introductions, an overview of our organization, our LawHelp program, our LawHelp.org site, our experience with Google Analytics, and an introduction to our project goals. We felt it was very important to do some context setting to give Aaron and Rosa the background they would need to successfully execute the project. During this discussion, after verifying that the scope of our project was workable from Aaron and Rosa’s perspective, we also defined some concrete outcomes we hoped to see. The meeting prep and the meeting itself took about two hours total.

After a few days processing our discussion, we worked with Aaron and Rosa to arrange a time to review their work. It took them about two weeks (during the holiday season, no less!) to complete the project for our review. Aaron and Rosa were very prepared: they had put together a nicely structured, information-rich PowerPoint presentation. We needed time as an organization to digest their recommendations and how they fit into our larger technology plan, but Aaron and Rosa were very open to follow-up questions about additional resources, clarification points, and so on after the presentation. The final step was completing the mentor and mentee evaluations. The final review, internal debrief with our staff, and evaluations took about three hours.

In total, we invested about six hours in the project; our mentee invested about 15 hours of her time, and our expert worked on the project for about 12-14 hours. Additionally, throughout the process, the Analysis Exchange sent check-in emails and was quick to respond to any questions or concerns that we had.

Outcomes and Next Steps
The final result of the project was a set of comprehensive, expert recommendations to help our program staff and tech team improve our Google Analytics profile. These recommendations will help us configure Google Analytics more accurately and gave us ideas for changes that will help us better understand user behavior on the site. These steps will ultimately lead us toward our desired usability improvements on LawHelp.org and LawHelp.org/espanol.

Considerations
Based on our experience, here are a few lessons learned and considerations for LawHelp community members interested in the Analysis Exchange:

  • Check out the Analysis Exchange’s additional resources, such as their video explanation on how to create a project, their Opportunities and Expectations Handbook, and their blog for case studies.
  • Define your goals carefully. It is the key to a successful project. Make sure that the scope is not too broad, and make sure the goals lend themselves to concrete outcomes and steps you can take.
  • Take time for introductions. Introduce your organization, website, website strategy, and project goals to your mentee and mentor for larger context. Don’t forget to find out more about their interest in the project too!
  • Clarify concrete outcomes you hope to gain from the project. This will help to set expectations around the project final product and provide you with concrete, implementable next steps.

We enjoyed our involvement with the Analysis Exchange and encourage other non-profits to consider this valuable resource!

posted by: Jillian Theil

Filed under nonprofit measure

0 notes

Analysis Exchange Announces Scholarship Winners

Thanks to our generous sponsors, ObservePoint, IQ Workforce, and Jim Sterne, the Analysis Exchange is awarding $500 in scholarship money to two applicants each quarter for their continuing web analytics education, as well as a pass to an eMetrics conference of their choice.

We wanted to take a few minutes to share a little bit about our first and second quarter winners.  Congratulations to our winners: Joan Cole, Tina Arnoldi, Stefanos Kapetanakis, and Grace Begany. Thank you for volunteering your time to support the Analysis Exchange!

Joan Cole completed 3 Analysis Exchange projects and had an overall rating of 9.7/10 and outstanding reviews.  Joan was new to the field of Web Analytics and has been able to find some work in the field as a result of her Analysis Exchange experience. Joan recently served on a panel at the Chicago eMetrics conference to share her Analysis Exchange experience with other aspiring web analysts.  Joan is using her scholarship money toward the UBC Award of Achievement Program in Web Analytics.

Tina Arnoldi has been a part of the Analysis Exchange since May 2011, as both an organization lead and a student.  Tina has a 9.7/10 rating on her evaluations. It is wonderful that Tina has been able to help other nonprofits, in addition to her own, get value out of using Google Analytics. Tina will be using her scholarship money to help pay for a conference where she is presenting on Google Analytics!  Thanks, Tina, for continuing to spread the word!

Stefanos Kapetanakis has been a part of the Analysis Exchange since January 2012 and has received a 10/10 as a mentor on 2 projects.  Stefanos has been able to share his experience with new analysts in the field while continuing to grow as a professional.  Stefanos will put his scholarship money toward furthering his web analytics education by completing the Market Motive class.

Grace Begany has completed 3 projects with the Analysis Exchange since January 2010, and has a 9.6/10 rating and outstanding recommendations!  Grace would like to help further extend and represent digital analytics activities within academia. As a soon-to-be doctoral student in Information Science, she would like to accomplish this goal by incorporating her existing digital analytics knowledge into her study and research at the university. The Analysis Exchange Scholarship will allow her to undertake a variety of knowledge-building activities, pushing her toward this goal.

Our next scholarships will be awarded this fall.  If you have worked on an Analysis Exchange project and could use up to $500 to help further your web analytics education, you can find details on how to apply for the scholarship here: http://www.webanalyticsdemystified.com/ae/scholarship/index.asp

Filed under measure nonprofit

4 notes

Technical Audits Can Help to Define Analysis Exchange Projects

As a non-profit, you know that you have the ability to get free help from Web Analytics professionals by participating in the Analysis Exchange, and you want to make the most of this fantastic opportunity.  You know that you will be working with mentors, who are highly accomplished in their field, as well as students, who are eager to extend their digital measurement skills.  You also know that everyone will be volunteering their time. 

But maybe you are just starting out in web analytics, as many non-profits are, and even the simple task of determining the goals of a project seems intimidating, if not overwhelming.  Requesting a technical audit for your first Analysis Exchange project may be the solution.

 

What is a Technical Audit?

It may be surprising to hear that, out of the box, even the best analytics software doesn’t track everything you need to know about your website.  Believe it or not, there may be many marketing efforts that are missing from your reports, or some of your reports may be inaccurate due to various tracking issues.  How do you know which reports are working as you would want them to, and which ones are misrepresenting your data?

A technical audit is a thorough evaluation of your website’s analytics implementation that shows you what your software may be missing or reporting inaccurately.  It gives you a clear picture of your reporting limitations, and its ultimate goal is to identify the issues that must be addressed so that key visitor behaviors are captured accurately and your data-driven recommendations are reliable.  By requesting an audit from the mentor and student in your first Analysis Exchange project, you can identify the areas of greatest need, prioritize your subsequent projects accordingly, and come away with ideas for future work.  This approach gives you greater confidence that your first project will succeed, and it sets the stage for the steps that follow.  It can give you the greatest bang for your buck, even though you’re not spending a penny!

Technical audits normally cost thousands of dollars, and their findings can be quite extensive.  Although a free audit may not include everything listed below, you can be assured that your mentor and student will provide findings that improve your data-driven decision-making.  The following are among the findings you might receive from a free technical audit:

  • Identification of pages that are not tracked by current, functioning default JavaScript tracking code or log files
  • Recommendations to improve content and audience segmentation
  • Confirmation that signup and donation paths and confirmations are counted as funnels and conversions for more detailed reporting
  • Assurance that site search keywords and navigation impacts are measured
  • Recommendations for improvements to dashboard layouts, summaries, sharing, and underlying reports
  • Verification that all active marketing efforts are counted as campaigns, with more detailed reporting by channel, source, and creative
  • Confirmation of integration of search engine marketing (SEM) pay-per-click (PPC) campaigns

In addition, an audit can focus specifically on your Google Analytics setup, or it can evaluate your campaign attribution efforts. A lightweight way to spot-check the most basic of these items, whether your pages carry the tracking code at all, is sketched below.
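To make the first audit finding above a bit more concrete, here is a minimal sketch, not part of the Analysis Exchange materials, of how a volunteer might spot-check whether a handful of pages carry a Google Analytics tag at all. The URLs and signature strings are placeholders invented for this sketch, and a real audit goes far beyond simple string matching:

```typescript
// Minimal page-tagging spot check: fetch each URL and look for common
// Google Analytics signatures in the returned HTML. This only shows whether
// a tag is present at all; it says nothing about whether it fires correctly.
const pagesToCheck = [
  "https://www.example.org/",          // placeholder URLs; substitute your own pages
  "https://www.example.org/donate",
  "https://www.example.org/contact",
];

// Strings that typically appear when a classic (ga.js-era) GA tag is installed.
const gaSignatures = ["google-analytics.com/ga.js", "_gaq.push"];

async function auditPages(urls: string[]): Promise<void> {
  for (const url of urls) {
    const html = await (await fetch(url)).text();
    const tagged = gaSignatures.some((sig) => html.includes(sig));
    console.log(`${tagged ? "TAGGED " : "MISSING"} ${url}`);
  }
}

auditPages(pagesToCheck).catch(console.error);
```

A quick pass like this surfaces untagged pages, which is exactly the kind of gap the first finding above describes.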

 

Agora Partnerships’ First AE Project

Dorrit Lowsen, COO at Agora Partnerships, a non-profit organization that works with promising entrepreneurs in developing countries, was the beneficiary of this approach in her first project with the Analysis Exchange.  She hoped to use website and social media data to understand how effective Agora’s web presence was at achieving each of the goals the organization had established.  Peter Howley, Project Mentor and Principal at Empirical Path, performed an abbreviated Google Analytics audit on the Agora website, and Joan Cole, Project Student, provided an analysis and summary of the tracking enhancements needed to improve Agora’s reporting capabilities.  The findings were enlightening: there was no tracking in place for campaigns, non-page events, ecommerce, or social media.  This meant that Agora could not identify which marketing efforts were the most successful, and that their reporting of Direct and Referral traffic was inaccurate.

From learning about these reporting issues, Dorrit was able to make an educated decision about the ultimate goal of her first project.  She decided that campaign tracking issues should take priority on this project, and Peter and Joan provided her with a campaign tracking tool and methodology that addressed her most critical need first.  She also came away from the project with several ideas for future projects and the feeling that her first project was successful.  You can read more about the Agora Partnerships Analysis Exchange project here.

Submitted by Analysis Exchange Student, Joan Cole

Filed under measure nonprofit submission

125 notes

Top 5 Mentoring Lessons Learned in the Analysis Exchange

Last week, as I finished my first mentorship in the Analysis Exchange, I spent some time reflecting on what I learned from my first project and what I would carry into the next.  While I initially had some reservations about pursuing this fabulous web analytics opportunity, it was a great experience that I would recommend for non-profit organizations, students, and mentors alike.  None of the lessons below caused our project to fail, quite the contrary; but in the spirit of continuous improvement, these are opportunities going forward that will hopefully help others in my shoes as well.  Live and learn is the name of the game, right?

1.     Time Management.  Before committing to the project, think about time management.  Will this project, given differing time zones, require you to use any time from your full-time position?  Do you need to make arrangements to reserve this time for Analysis Exchange (AE)?  In my situation, the organization we worked with was based in Switzerland, with the student in Oregon and me in Missouri.  There were limited times we could meet that wouldn’t mean one of us was losing sleep.  I did make efforts to minimize the AE work during my work day, which was helpful.  In our virtual working world, this won’t be a problem for most, but thinking through the logistics is worth the planning effort.

2.     Project Plan.  This may seem like a given, but once I began the project, I was so impressed with my student that I didn’t feel a written project plan was of the utmost importance.  We had a plan with the organization, but it would have been helpful to have a written plan between the student and me to ensure we were both in sync on the deadlines to review, edit, and finalize the presentation.  We worked together harmoniously, but a written plan would have helped at the end, when we became a bit crunched for time.

3.     Mentor the Organization as much as the Student.  One piece of feedback I received from the organization we worked with was that they didn’t receive many web analytics recommendations from me personally.  This surprised me at first, because many of my suggestions had been incorporated and presented by the student.  My view of the project was that it was the student’s project and I would add background value, helping the student to shine.  What I failed to think through is that the organization didn’t see all of the back-and-forth communication between the student and me.  They wouldn’t recognize my efforts or contributions, and understandably so.  The value of the mentor role then becomes a bit vague for the organization.  For the next project, I will try to engage the organization in the conversations the student and I have separately.  This may mean a mid-project call or two with the organization, the student, and me, but it will help the organization learn just as much as the student.

4.     Rehearse, rehearse, rehearse.  As I mentioned, my student impressed me.  He far exceeded my expectations of the caliber of analytical skills I would find among AE students.  Most of our communications were via email, which worked very efficiently.  However, in hindsight, another opportunity to provide feedback and help the student grow is to rehearse the presentation on a call.  Part of the challenge of web analytics is delivering insightful analysis in a simple, easily digestible format.  Presentation is half of this battle, and feedback on presentation style, flow, and timing just can’t be given as thoughtfully via email alone.

5.     Just Do It.  Before participating in the Analysis Exchange, I was apprehensive about whether I could do it.  I won’t even elaborate on the reasons why, because they just seem silly now.  The Analysis Exchange is such an amazing experience for all parties involved.  What other prospect exists to do what you love while helping others learn and giving back to very worthy non-profit organizations?  None that I’ve seen.  Nike got this one very right: next time, I’ll just do it.

Have you been a mentor, student or organization participating in the Analysis Exchange?  What lessons did you learn along the way?  What would you do differently the next time around?

Submitted by Analysis Exchange Mentor, Angie Bledsoe

Filed under measure nonprofit

122 notes

Analysis Exchange Announces Scholarship Program

We are incredibly happy to announce the creation of the Analysis Exchange Scholarship Fund. You can read the press release and learn more about the effort at the Analysis Exchange web site, but in a nutshell, thanks to the generosity of ObservePoint and IQ Workforce we are now able to financially support Analysis Exchange members in their efforts to expand their web analytics horizons.  To read more about this effort, click here.

Filed under measure nonprofit

117 notes

Analysis Exchange Project with Kaufmann Mercantile

The most common question that we are asked by students at the Analysis Exchange is: 

“How can I make sure that I get chosen for a project?”

Our experience has shown us that there are three proven ways to get yourself onto a project.  They are: 

 

1.  Make sure that your profile in the system is up-to-date and does a good job of selling yourself to a prospective nonprofit.  This is one of those rare times in life when you get to brag about yourself so make sure you highlight your strengths and why they should pick you over others!

2.  Be sure to continuously apply to new projects.  Check back often or follow us on Twitter, @analysisxchange, to see when new projects are posted.  We are working on a way to e-mail students and mentors when new projects are posted.

3.  Bring in your own organization.  If you know of a nonprofit that could use help, have them come to the Analysis Exchange and make sure they pick you!  Control your own destiny! 

 

For our part, we are doing our best to get as many nonprofits to join the Analysis Exchange as possible, but the reality is that we have 815 students in our system and only 250 nonprofits, so you may have to be patient.

Another way that we are hoping to get more potential projects into the Analysis Exchange is by opening it up to start-ups that are not yet profitable.  Our hope is that by allowing these organizations to take advantage of the infrastructure already built for the Analysis Exchange, we can increase our project base and create more learning opportunities for our 815 students.

We recently did a trial project with our first start-up website, a company based in New York: Kaufmann Mercantile.  This project demonstrated the power of the Analysis Exchange and even had a mentor from Denmark!

The project team was made up of the following people: 

  • Sebastian Kaufmann - the Organizational Lead based in NYC
  • Casper Blicher Olsen - the Project Mentor based in Denmark
  • Brian Wonch - the Student based in Chicago

The following is an interview that provides an overview of the Kaufmann Mercantile project.  Thanks to Kaufmann Mercantile and the Analysis Exchange team for sharing this with us!

AE: How did you first hear about the Analysis Exchange?

Sebastian: A friend of mine, Ryan MacCarrigan from Lean Startup Machine (http://theleanstartupmachine.com/), told me about the Analysis Exchange.

 

AE: How did you do Web Analytics before the Analysis Exchange? 

Sebastian: We have had a Google Analytics account since the beginning of our store. I checked it occasionally, but didn’t really know what information to pull out of all this data.

 

AE: What was your Analysis Exchange project objective? How does web analytics fit within the business structure of your company? 

Sebastian: Our objective was to find weak parts of our website and see where people were leaving the site and how we can avoid this. The bottom line was that we wanted to improve our conversion rate. We’re a small company and don’t have a person assigned to the subject of web analytics. Therefore I was working directly with the team. It was a great learning experience that way. 

Casper: When we first started the project, there was no doubt that their Google Analytics account had essentially no prior configuration or customization. It was therefore necessary to first look into how the data was collected on the site. The result was a modification to their Google Analytics tracking script so that it no longer caused self-referrals within the same domain.
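The post doesn’t show the exact change that was made. As a hedged illustration only: the classic ga.js snippet of that era exposed a _setDomainName call, and pointing it at the top-level domain was a common way to keep traffic moving between subdomains from registering as self-referrals. The property ID and domain below are placeholders, and this may not be the specific fix made on the Kaufmann Mercantile site:

```typescript
// Classic (ga.js-era) asynchronous tracking snippet, typed loosely for this sketch.
// _gaq is the command queue that ga.js processes once it loads.
const _gaq: unknown[][] = ((window as any)._gaq = (window as any)._gaq || []);

_gaq.push(["_setAccount", "UA-XXXXXX-1"]);       // placeholder web property ID
_gaq.push(["_setDomainName", ".example.com"]);   // share the tracking cookie across
                                                 // subdomains so moving between them
                                                 // is not recorded as a self-referral
_gaq.push(["_trackPageview"]);
```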

Based on the organization structure, the challenge was to make web analytics data more accessible and easier to take action upon. To help make the data more accessible we created several custom reports, such as Newsletter Effectiveness, Page Efficiency, Paid Search Analysis, and Visitor Acquisition Efficiency, as well as some advanced segments based on the KPIs of the company.

The new custom reports now make it faster and easier for Kaufmann Mercantile to monitor things like: 

  •  Which elements of the newsletter are creating sales and which are not?
  •  What is the acquisition cost on different sources?
  •  Which pages are generating sales?
  •  What pages should we optimize for SEO?
  •  Qualified Audience

AE: How did the Analysis Exchange process work for you?  What can you tell us about your experience with your mentor and student? 

Sebastian: In the beginning the three of us discussed our objectives for the seminar and what we should focus on. My mentor, Casper Blicher Olsen, then put together a schedule for the 3 weeks. Much of the work then happened between student and mentor and out of my sight. But we had about two phone conferences every week, where they explained what they were working on and the progress they were making. These conversations also allowed me to ask questions etc.

Working with the mentor and student was a great experience. It’s awesome to have two experts analyze your website and give you feedback and information that would take weeks or months of reading to figure out by yourself. My knowledge of the subject increased tremendously in this short amount of time.

 

AE: What useful insights into your website did you gain? Please be specific.   

Sebastian: As mentioned, the conversion rate was the main focus of our seminar. I found out that our product page, shopping cart, and checkout area had very high drop-off rates, and that we need to do some redesign in order to have fewer people leave our site or abandon their carts.

I also learned about A/B testing and we actually did an A/B test on our site. We tested whether a red or a green “Add to cart” button would be more effective. Red won, which was the color we had already used.  

Another thing I learned was how you can track campaigns and their conversions in Google Analytics.
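For readers new to campaign tracking: Google Analytics attributes a visit to a campaign when the landing-page URL carries UTM query parameters (utm_source, utm_medium, utm_campaign). As a small, hypothetical illustration (the domain and campaign names below are made up, not Kaufmann Mercantile’s), a newsletter link might be tagged like this:

```typescript
// Build a campaign-tagged landing-page URL using Google Analytics UTM parameters.
// Everything here is illustrative: the domain, campaign name, and path are placeholders.
function tagCampaignUrl(
  landingPage: string,
  source: string,    // where the traffic comes from, e.g. "newsletter"
  medium: string,    // the channel, e.g. "email"
  campaign: string,  // the specific campaign, e.g. "spring-sale"
): string {
  const url = new URL(landingPage);
  url.searchParams.set("utm_source", source);
  url.searchParams.set("utm_medium", medium);
  url.searchParams.set("utm_campaign", campaign);
  return url.toString();
}

// Example: a link placed in an email newsletter.
console.log(
  tagCampaignUrl("https://www.example.com/products/kettle", "newsletter", "email", "spring-sale")
);
// -> https://www.example.com/products/kettle?utm_source=newsletter&utm_medium=email&utm_campaign=spring-sale
```

Once visitors arrive through links tagged like this, Google Analytics can break conversions out by campaign in its standard reports.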

Casper: Based on the analysis of the Web Analytics data we chose to work with two hypotheses: 

a.) Visitors don’t notice the “Add to cart” button, since it is the same color as the rest of the site.  

b.) Visitors are abandoning the checkout process because they get confused. 

Testing the product pages

To test the hypotheses we created a classic A/B split test, with one control and two variations of the “Add to cart” button: one with a green button and a bigger font, and another with a red button and a bigger font. The test winner ended up being the original small red button. Despite that result, the outcome was still a big success, since they now know which button color works best for their visitors.
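As an aside for readers who have not run a split test before, the core arithmetic is simple: each version’s conversion rate is conversions divided by visitors, and relative lift compares a variation to the control. The counts below are invented for illustration and are not the project’s actual data:

```typescript
// Evaluate a simple A/B/C split test: conversion rate per version and lift vs. control.
// The visitor and conversion counts are made-up example numbers, not real project data.
interface Variant {
  name: string;
  visitors: number;
  conversions: number;
}

const control: Variant = { name: "original red button", visitors: 4000, conversions: 120 };
const variants: Variant[] = [
  { name: "green button, bigger font", visitors: 4000, conversions: 104 },
  { name: "red button, bigger font",   visitors: 4000, conversions: 112 },
];

const rate = (v: Variant) => v.conversions / v.visitors;

for (const v of [control, ...variants]) {
  const lift = (rate(v) - rate(control)) / rate(control);   // relative lift vs. control
  console.log(
    `${v.name}: ${(rate(v) * 100).toFixed(2)}% conversion, ` +
    `${(lift * 100).toFixed(1)}% lift vs. control`
  );
}
```

With counts this small, differences of a few conversions may not be statistically significant, so real tests should run long enough (or be checked with a significance calculator) before a winner is declared.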

All three different versions of the “Add to cart” buttons: 

 

Since the analysis also revealed that users had a hard time figuring out the product pages, we used the knowledge from the web analytics data to redesign the existing product pages. With the new redesign in place, it would be worthwhile to run a new test on all the product pages to see whether the redesign has any impact on sales.

Our wireframe to the product pages can be found here:


Optimizing the checkout process 

The final recommendation we made for the company was to optimize their checkout process, since a large share of visitors who started the checkout process didn’t complete it. To illustrate this, we first created a blurred version of the Cart page to see how visible the “Go to checkout” button was (see image below). Based on the results of the blur test and other relevant elements, the checkout pages were modified into a more user-friendly version.

Blur version of the Cart page: 


After Kaufmann Mercantile implemented the changes to the checkout and product pages, there was a 22.9% increase in sales and a 28.9% increase in the conversion rate.

 

AE: Would you do another Analysis Exchange project? 

Sebastian: Absolutely! I’m already in conversation with the mentor from the last project about what to do next.  

 

AE: What can you tell other small businesses/start ups about the Analysis Exchange? 

Sebastian: It’s going to be a rapid learning experience of how to use Google Analytics and how it can help you make decisions. 

 

AE: What advice can you give other companies on what to look for in students and mentors?

Sebastian: Since it was my first project, I don’t have much of a comparison. My mentor was extremely knowledgeable about online stores, which was a good fit.

AE: What is your latest project at Kaufmann Mercantile?    

Sebastian: We’re currently in the process of redesigning and adding functionality to our product pages, cart, and checkout area. Once the design is “finished” I’d like to do some A/B testing on the new design and see how we can optimize.

The following questions are from the student’s perspective.

AE: What was your primary driver for doing an Analysis Exchange project?

Brian: I wanted to round out the digital analytics training from my graduate study at Northwestern University’s Integrated Marketing Communications program.  The Analysis Exchange gave me an opportunity to work on an in-depth Google Analytics implementation with expert guidance.  The e-commerce aspect of the Kaufmann Mercantile project also seemed like a good fit for my background.

AE: How much of a time commitment was the project?

Brian: I was able to fit the Exchange project alongside a summer internship, so the time commitment was manageable.  Also, my project mentor generously offered me extra time outside of our official work together to discuss his experiences in the analytics field.

AE: What did you learn from doing the project?

Brian: I learned to focus on what can be tested and changed, how to put data in context, and how to quantify the impact of making changes on a site.

AE: What can you tell other students about the Analysis Exchange?

Brian: Digital analytics is not a spectator sport - you learn the most through practice.  The Analysis Exchange is a great way to get experience, and the mentors know their stuff.

Filed under measure nonprofit

405 notes

Public Media Stations Share Their Analysis Exchange Success Stories

On September 21, 2011, the National Center for Media Engagement and the Integrated Media Association coordinated a webinar entitled “Insights From Metrics: Public Media and Analysis Exchange”.  Thank you to Steve Ley from KDHX in St. Louis, Chelsea Lund from Pioneer Public TV in Minnesota, and Trevor Clendenin from Virgin Islands Public Television for sharing your Analysis Exchange success stories and how you are using web analytics at your stations.  And thank you to the National Center for Media Engagement and the Integrated Media Association for providing a forum for us to share our story. If you are interested in hearing this webinar, it is recorded and posted on the NCME website here.

Filed under measure nonprofit