Laura Quinn's blog

Data Visualization Tools - An Early Preview

Here at Idealware world headquarters, we're working on a report on Tools to Graphically Depict Data on a Shoestring (I know, the title needs some work). We're still very much doing research and writing, but we've mapped out a pretty decent view of the tools that are available in this space, so I thought I'd share and see if you know of any I'm missing.

Here's what I've got, for tools that will help you display quantitative data in a visual form without a lot of time, money, or specific skills:
  • Excel: the obvious one. It's quite a flexible and complex tool compared to the others (the two go together -- its flexibility is so obscure and complicated that many don't know it's there), but it doesn't make it easy to publish graphs online or even in polished printed form.
  • Google Docs: nice features for both simple and more interactive graphs, and pretty polished graphs, though very little control over the look of them (check out both the Charts and the Widgets features). All can be easily embedded. Free.
  • ManyEyes: the best known of the online visualization tools, with a lot of great format options, and pretty professional looking (though again, very little control over the look). You must publicly publish your data with ManyEyes in order to use the tool. Free.
  • DabbleDB: lets you create nice, simple graphics from your data, quickly and easily. Free if you share your data; $8/user/month otherwise.
  • Swivel, iCharts, WidGenie: all online tools that let you easily create charts from data, and then publish them. We're still researching them, so I don't know as much about them.
Those are ones you don't need a programmer to use; if you've got a programmer, consider FusionCharts or Chart Director as coding language plug-in libraries, or the Google Visualization API, Yahoo Charting API or Open Flash Charts. Or if this is going to be a big part of what you do, consider R or Processing as visualization/stats specific programming languages. (tip o' the hat to Chris Mulligan at YouGov for the Yahoo API and Open Flash Chart)
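To give a flavor of the programmer-oriented approach: charting APIs like Google's generally draw from a structured data table rather than a spreadsheet. Here's a minimal sketch in Python (with made-up donation figures) of building the JSON "DataTable" structure of typed columns and rows of cells that Google Visualization API charts consume -- the exact layout here follows Google's documented format, but check the current API docs before relying on it.

```python
import json

# Hypothetical donation totals by month -- illustration only.
data = [("Jan", 1200), ("Feb", 950), ("Mar", 1700)]

# A DataTable is a list of typed columns plus rows of cell
# objects, where each cell is {"v": value}.
table = {
    "cols": [
        {"id": "month", "label": "Month", "type": "string"},
        {"id": "total", "label": "Donations ($)", "type": "number"},
    ],
    "rows": [{"c": [{"v": month}, {"v": total}]} for month, total in data],
}

# This JSON string is what you'd hand to a chart on the web page.
payload = json.dumps(table)
```

The payoff of this approach over a canned tool is that the data side lives in whatever system you already have, and the chart just renders whatever table you feed it.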

What else is out there? What have I missed?

"Listening" vs. "Asking" to Find Out What People Think

I'm doing a bit of research of late about using online tools to find out what people think about your organization. It's an interesting area - there's a vast number of tools (many very similar to each other) that can help you monitor and listen to what people have to say online and a big body of useful best practices and case studies about how to use them. (By the way, Beth Kanter's information in this area is even more useful than her usually very useful stuff!)

I have a substantial background in research - traditional ways to find out what people think, like surveys and interviews - and I can't help but notice that there's almost nothing that I can find connecting the "online listening" area to more formal research techniques. There's got to be overlap there, right? There are likely good lessons to be shared between them in both directions.

A couple that jump to mind: I think that often formal research overlooks the idea of listening to what people say on their own without the presence of a researcher (the online world makes this so much easier), which is certainly a useful thing to do. In the other direction, I think there's too little discussion in the writings and posts in the "online listening" world of what it means that people are saying things without being asked. The stuff that they say is certainly worth hearing (and you certainly can't ignore it), but you also need to keep in mind that you're likely not getting the full picture that way. The people talking on their own are going to be the ones with strong opinions, so they're unlikely to be typical of your average constituent... if that's what you want to know.

I think this spectrum of "listening" to "asking" is a pretty useful one to consider. Both are important to find out what people think about you. In fact, add in an "informal" to "formal" axis, and you've got a nifty chart:



(By the way, this is not intended to imply that research is "best" because it sits in the typically best upper right quadrant.... this is simply the order that makes the most conceptual sense, I think)

Resource Roundup 5/21

Lots of terrific resources released recently...

eNonprofit Benchmarks Study 2009
If you're not familiar with the eNonprofit Benchmarks reports, you should be. They're fabulous, with a ton of useful benchmarks as to what you can expect in terms of email, online fundraising, and other online metrics based on actual research. They've just released the 2009 version, which has a number of new areas of exploration as well as updates on the old.

Online Seminar Series: Client and Service Management Software for Human Service Organizations
NPower Oregon is doing a really interesting series of online seminars about Client/Service/Case Management systems. In a five seminar series, one per week starting on June 3rd, the terrific Shawn Michael will help you identify your needs, evaluate software choices, and plan for implementation - including substantial demos of ServicePoint by Bowman, ClientTrack by DSI, and ETO by Social Solutions.

Should you drop your membership amount?
As always, it depends. But M&R will tell you what it depends on in their new whitepaper.

Social Networks for Nonprofits: Why You Should Grow Your Own
I'm leery of telling nonprofits they should develop their own social networks, as in my experience far more build them than succeed with them. But if you're planning on it, this report has some interesting insight and tips.

Getting Good Data from Informal Surveys

I'll admit it: I'm a research geek. I really care a lot about things that most people don't, like methods of data analysis and obscure types of bias. But that said, I also think that people should care a lot more about research methodology than they seem to. If you're going to be acting on the results of research, or particularly if you're going to conduct it, there are some basic tenets you need to know.

Take, for instance, informal surveys. There are lots of these coming out every month, and they're easy to do: slap some questions together in SurveyMonkey, mail it to a discussion list, and you've got data. But not so fast. Just because you've gathered it doesn't mean it can actually tell you anything.

The main issue to keep in mind for any informal survey is response bias. If you're surveying a specific, limited population (say, only members of an organization, or people who have used your services), carefully craft your survey and approach, ensure you only get one response from each person, and 50-60% of everyone you try to survey responds, you might not have to worry about response bias. Otherwise, it's a huge concern. And yes, that's almost every survey that mere mortals might do.

Response bias means that your data is skewed towards those who chose to answer your survey - typically, those more emotionally invested or interested by your topic. It means that your data doesn't represent any larger population, but only those who choose to answer.

Let's say I want to find out about pizza. I put together a survey, and send it out to a few mailing lists with a note "Please take our pizza survey!" A few days later, I tally the data, and amazingly, it turns out that everyone loves pizza as much as I do. 90% of everyone loves pizza! I've discovered a new trend! But no. This is an example of response bias. What I've actually found out is that 90% of people who were motivated to fill out a survey about pizza like pizza. A lot less interesting, huh? Those who don't care about pizza, or thought it was inane to do a survey about it, or didn't feel like they knew much about pizza, didn't respond at all.

Importantly, it doesn't matter how many people I get to fill out the survey. I could get a million people to fill it out and it would be exactly as biased. My 90% figure would still be fatally flawed.

But even though my survey is biased towards those interested in pizza, I could still get some interesting data. I could, for instance, gather some data about toppings - it would be unscientific but interesting to find out that 20% of my respondents enjoy pepperoni, while only 10% enjoy mushrooms on their pizza. I wouldn't bet the farm on this data - there's no way to be certain that the lists I posted the survey to aren't somehow skewed towards pepperoni lovers, or followed diligently by a pepperoni lobbyist who stacked my results - but it's certainly not fatally flawed in the same way.
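The pizza numbers above are easy to reproduce in a quick simulation (a sketch, with made-up response probabilities): even when only half of a population loves pizza, if lovers are ten times likelier to answer a pizza survey, about 90% of responders will be lovers -- and collecting more responses doesn't help at all.

```python
import random

random.seed(42)

# Simulated "true" population: only half the people love pizza.
N = 100_000
population = [random.random() < 0.5 for _ in range(N)]  # True = loves pizza

def responds(loves_pizza):
    # Hypothetical response probabilities -- pizza lovers are ten
    # times more likely to bother answering a survey about pizza.
    return random.random() < (0.30 if loves_pizza else 0.03)

respondents = [loves for loves in population if responds(loves)]

true_rate = sum(population) / len(population)
observed_rate = sum(respondents) / len(respondents)

print(f"True share who love pizza:     {true_rate:.0%}")
print(f"Share among survey responders: {observed_rate:.0%}")
# Roughly half the population loves pizza, yet roughly nine in ten
# responders do. Raising N only makes the biased figure more precise.
```

The skew comes entirely from who chooses to respond, which is exactly why a million responses would be exactly as misleading as a hundred.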

So what does this all mean? Some tips:
  • Be suspicious of sweeping demographic conclusions that have been reached based on anything but big, carefully designed studies
  • Look for the methodology. Any reputable survey should give a sense of who they reached out to, including some ballpark number of people and a sense of the response rate.
  • Useful surveys are hard to design. Please find someone who can help design a process that will provide reasonable data. Bad data can be more than useless - it can be misleading.

Live Demos of WordPress, Joomla, Drupal, Plone!

For anyone who's ever asked how WordPress compares to Joomla, or Joomla to Drupal... we have your answers in demo form. It's the return of the Open Source CMS webinar!

TOMORROW (Wed) at 1-2:30 Eastern, Idealware's conducting the online seminar Comparing Open Source CMSs: WordPress, Joomla, Drupal, and Plone, for a $40 registration fee. View more or register now.

We'll certainly talk through some of the information from our recent report on the same topic, but we'll spend most of our time demoing the systems and answering your questions. Heather Gardner-Madras (who is not only a blogger extraordinaire, but has actually implemented all four of these CMSs) will show the real differences between the systems. We have a somewhat different structure for demoing than our old one, as we're interested in really homing in on the differences between the systems - she'll focus on the key elements that make up a site in each system, and how those make a big difference in the flexibility and ease of setup of each.

Hope to "see" you there.

CRM on a Shoestring

Constituent Relationship Management (CRM) is an important concept for nonprofits. The idea is to have all your information about all your constituents together in one place, so you can see the full picture of each person's interaction with your organization. Instead of having all your donors in one system, your volunteers in another, and your event registrants in a third, you have everyone in a single system... or at least a way to sync up those systems. You can then see that, for instance, Joe Smith has both volunteered and registered for several events, and might be a good prospect as a donor.

It's a great concept, but it can be hard to implement (as Paul Hagen talks through so well in his two-part CRM series). And it's particularly hard to implement for an organization that's strapped for cash. What should your approach be if you're trying to get up and running with CRM on a shoestring? I see three possibilities:

Look to lower cost out-of-the-box integrated solutions
If you have a set of fairly common needs, and don't need really deep functionality in any one area, there are in fact tools that cover a wide variety of different types of interactions out of the box. Our recent Low Cost Donor Management report (published in partnership with NTEN) covered a number of them - for instance, Neon by Z2, Community Enterprise by CitySoft, Total Info by Easy-Ware, and Salsa by Democracy in Action each could be an interesting fit, depending on your needs. The idea here would be to carefully understand the full range of interactions that you're trying to support, and then evaluate the software to see if it fits. This approach won't work for everyone, though - some will find that there's really no system that does everything they need.

Choose a system that specializes in a specific area, and configure it for others
If you have fairly deep needs in a particular area, it might make more sense to look for a system that specializes in that area, and configure in less robust functionality in other areas. For instance, if donor management is a key priority, you might choose a fairly configurable donor management system, and use add-ons and custom fields to support other interactions (Our Donor Management report talks through a number of options here). More and more systems have pretty useful custom field setups (for instance, a number allow you to log a number of linked pieces of data about a particular action - like the date, number of hours, and description of volunteer participation). This approach will yield you deeper support in one area and less sophisticated support in others - but that could make sense if that mirrors your organizational priorities.
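As a sketch of what "linked pieces of data" about a single action might look like, here's a toy Python data model (hypothetical field names, not any particular vendor's schema): each volunteer action carries its own date, hours, and description, linked back to one constituent record.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class VolunteerActivity:
    # Linked custom fields logged against one action.
    activity_date: date
    hours: float
    description: str

@dataclass
class Constituent:
    name: str
    activities: list[VolunteerActivity] = field(default_factory=list)

    def total_volunteer_hours(self) -> float:
        # Roll up across all linked actions for reporting.
        return sum(a.hours for a in self.activities)

joe = Constituent("Joe Smith")
joe.activities.append(VolunteerActivity(date(2009, 5, 2), 3.5, "Event setup"))
joe.activities.append(VolunteerActivity(date(2009, 5, 16), 2.0, "Phone bank"))
```

The point of structuring it this way, rather than one free-text note field, is that you can report across the linked records - total hours, most recent activity, and so on - which is the kind of less-robust-but-workable support a configurable system's custom fields can give you.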

For the time rich but money poor: CRM Platforms
The idea of using a toolset, like Salesforce or CiviCRM, and configuring it carefully to meet your needs is gaining steam. This approach can yield great results, but typically requires a sizable investment of time from someone technical to configure the platform to meet your needs. They're intended to be tailored, so often provide less out of the box. I'm nervous about this as a shoestring approach, mostly as it appears to be cheaper than it is - the costs are often hidden in setup time and maintenance. It's easy to be drawn in by the lure of a powerful system for free (neither Salesforce nor CiviCRM costs anything for most organizations to acquire), and be sucked down the rabbit hole before you realize what you signed up for. But for organizations with technical staff members or volunteers who can devote the time, this can be a great low-cost approach.

What do you think? Are there other approaches I haven't thought of?

New report: Consumers Guide to Low Cost Donor Management Systems

We did a soft launch of our most recent report at the Nonprofit Technology Conference - but we're now eager to spread the word far and wide! If you haven't yet heard, Idealware in partnership with NTEN and NPower has just published A Consumers Guide to Low Cost Donor Management Systems (free registration required).

This report summarizes a huge amount of research - we took a look at 33 different donor management systems that cost less than $4250 in the first year. The research is broken up into two different actual reports. The first, the Consumers Guide, outlines the functionality that donor management systems provide, summarizes each of the 33 systems, recommends useful systems for a set of specific scenarios, provides a high-level comparison of 10 systems, and lists consultants who can help you select or implement software. The second, Detailed Reviews, provides six- to eight-page reviews of twelve different systems.

Check it out, and pass on the word! It's available for free (though registration is required) at www.idealware.org/donor or www.nten.org/dms_report

Resource Roundup 5/7

Okay, it's been a while since I've done a resource roundup for the blog - too long!

The Pros and Cons of Skype for Business (Small Business Computing)
An even handed look at how well Skype - a free service to make phone calls and video calls over the web - works in business situations.

Unified Communications Options for Nonprofits (TechSoup)
Useful overview of options to manage multiple communications methods - say, forwarding office calls to cell phones, or having voice mail show up in email.

Washington Post: Fundraising Via Email Is Way More Effective than via Social Networks (FrogLoop/ Washington Post)
The Washington Post posted an article about the efficacy (or lack thereof) of fundraising on Causes; FrogLoop offers a useful perspective on it.

Measuring Engagement and Return on Relationships (Beth's Blog)
Beth Kanter rounds up a bunch of resources that can help you think about measuring the return on investment for engaging constituents.

Ease of Use for Novices vs. Experts

As we're sprinting ahead on our Consumers Guide to Low Cost Donor Management Systems, one of the interesting aspects that we're considering is the ease of use for the novice vs. ease of use for the expert user.

I'm finding that not many people have really heard of the idea of expert ease of use, but it's a really important one in the usability realm. And it's particularly important when you're considering systems that you'll be using a lot. For instance, a development staff person might use a donor management system for hours per day.

Most people think of "Ease of Use" as measuring how easy a system is to learn - how intuitive is the layout and terminology? Can you figure out where you're supposed to go? Are complicated things simplified with wizards and multi-step processes? Will everyone need training? That's ease of use for a novice.

The idea of expert ease of use is to measure how much the system supports the work of people who already know the system well. Are repetitive tasks made easier? Are they fast, or do they take a million clicks? (Things like wizards are often a downside here, as they get in the way of someone who knows just what they want to do.) Do things generally seem to work the way they should, or are you always having to do weird hacks and work-arounds?

For instance, consider Adobe InDesign. I love InDesign, but no one would say it's easy to learn. Even if you know Photoshop or Illustrator, there's a ton of complexity and different functionality, as well as a whole other mental model. I always teach myself stuff from books, but with InDesign I'm stumbling and wishing I had taken a class.

But it's a miracle once you learn how to use it. Everything just seems to work just right. Page numbers? Done. Floating image with a caption on top of other text? Done. Need to bold the first sentence of every bullet point for forty pages of bullet points (a task close to my heart)? Like five clicks, and it's done. It's not optimized for the novice. It's optimized for the expert user - which completely makes sense for a tool like this, that many people use pretty much all day every day.

Contemplating Open Source CMS Security and Market Share

A couple of people, Four Kitchens for instance, have suggested that our analysis of security in our new report Comparing Open Source CMSs: WordPress, Joomla, Drupal, and Plone is less rigorous than it could have been.

First off: yes, absolutely. It could have been more rigorous. That's true of pretty much anything in the report. In fact, much of the art of doing a review like this is figuring out how to do a useful analysis that is achievable in a human lifetime. There's always more to know, to analyze, to drill down into. So there's no question that there's more to say about security than we said. If anyone wants to do an analysis that factors in severity and response time and history of actual exploitation, as Four Kitchens suggests, I'd love that. We'll use it in the (hopeful) update of the report. It's way beyond our current scope and budget to do.

However, folks have also suggested that the primary metric we used - vulnerabilities reported by Security Focus - isn't valid. There, I disagree. It's a rough measure, no doubt, but a useful one. The main criticism is that more popular systems have more eyes on them to generate more vulnerability reports. That's absolutely true. But the opposite is also true - there's more evil black hat folks trying to crack more popular systems, to take advantage of vulnerabilities. Michelle Murrain, our lead researcher on the report, says a lot more smart things than I could on this topic on her own blog.

And in fact, the differences are notable. Plone has two vulnerabilities reported, while all the other systems have more than 25. And the Plone community was able to give us a lot of reasons why that was. In a report like this, there's always a bit of a smell test going on. Do the numbers seem reasonable? Do they agree with what we're seeing and hearing as we talk to people? In this case, they definitely do. From all accounts, Plone is a system that was built with security as a priority. And the fact that it runs on an unusual environment makes it more of a pain to hack - and thus less likely to be hacked. And with all of that, what's our rating? Plone gets an Excellent, while everything else has a Solid. Hardly a stinging indictment.

By the way, David Guilhufe also had a few comments about our Market Share analysis (buried as the last Appendix, so David gets a gold star as a careful reader). Yeah, I'm not going to hold that up as a paragon of market research. It's shockingly difficult to find any useful numbers that one can compare across systems - downloads? users? developers? Nope. Our main goal with the analysis was not to actually compare the popularity of the four systems we reviewed (and you'll notice, we didn't do so anywhere in the report), but to show why we chose those four systems as opposed to, say, Typo3 or Movable Type. And there the four stand out pretty well in the nonprofit market. David mentions that WordPress should be the most dominant - I don't know about that. For nonprofit websites, as opposed to blogs? That's what we were trying to assess...