Tools for Nonprofits to Effectively Manage and Analyze Their Data
When people think about data, data visualization (a.k.a. dataviz) is often the star that gets all the attention. But in many ways, dataviz is just the visible tip of an iceberg of data work. To efficiently and consistently produce high-quality data visualizations, whether in health, education, poverty, or any other important issue, mission-driven organizations first need platforms in place to manage and analyze their data.
When considering how to approach data management, a modern data platform, or “stack”, needs to do four things really well:
- Get the data
- Store the data
- Model the data
- Analyze the data
Historically, researchers and policy analysts have had to rely on outdated approaches to these steps: spreadsheets, shared drives, and internal databases of varying sophistication. The great news is that new approaches and affordable tools are changing how nonprofits can manage and analyze data at every step of this process.
1. Tools to Help You ‘Get the Data’
The first step in managing data is ingesting and transforming it. How do you get data into your platform for future analysis? In the old days, getting data out of one system and into a data warehouse or central data store required ad-hoc scripting, couldn't accommodate real-time connections and updates, and was difficult to manage over time.
There are a number of products available today designed specifically to take the pain out of this "extraction" of data. These "data-piping" tools let you connect to a wide range of cloud-based services and data sources, extract data in real time, map and transform it, and move it into a cloud-based data store (the next "layer" in our modern data stack).
We recommend tools such as Alooma, Segment, Fivetran, and Matillion in this first stage.
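As a minimal sketch of what these extraction tools handle for you (the source format and field names here are hypothetical, not drawn from any specific product), a "get the data" step pulls raw records from a source system, renames and cleans the fields, and emits rows ready for loading:

```python
from datetime import date

def transform(raw_records):
    """Map raw source records into clean, consistently named rows."""
    rows = []
    for rec in raw_records:
        rows.append({
            "donor_id": int(rec["DonorID"]),
            "amount_usd": round(float(rec["Amt"]), 2),
            "donated_on": date.fromisoformat(rec["Date"]),
        })
    return rows

# Raw records as they might arrive from a fundraising system's export
raw = [
    {"DonorID": "101", "Amt": "25.5", "Date": "2024-03-01"},
    {"DonorID": "102", "Amt": "100", "Date": "2024-03-02"},
]
clean = transform(raw)
```

In practice the tools above do this mapping through a point-and-click interface, plus the scheduling, retries, and change tracking that made hand-rolled scripts so painful.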
2. Tools to Help You ‘Store the Data’
Once you've got your data, you need to keep it somewhere in the cloud where you can continue to add new or updated data as it comes in. Over the past eight years, the commercial sector's focus on big data and data processing has had a huge impact on the availability and affordability of cloud data warehousing. What used to be expensive and hard to set up and manage is now much easier and more affordable.
Tools we recommend at this stage include Google BigQuery, Amazon Redshift, and Snowflake.
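These warehouses are loaded and queried with SQL, and they are built for repeated, incremental loads. As a rough illustration only (using Python's built-in SQLite in place of a real cloud warehouse, with a hypothetical donations table), appending new batches as they arrive looks like:

```python
import sqlite3

conn = sqlite3.connect(":memory:")  # stands in for a cloud warehouse
conn.execute("""
    CREATE TABLE donations (
        donor_id   INTEGER,
        amount_usd REAL,
        donated_on TEXT
    )
""")

def load(rows):
    """Append a batch of rows; each load adds to what is already stored."""
    conn.executemany(
        "INSERT INTO donations VALUES (:donor_id, :amount_usd, :donated_on)",
        rows,
    )
    conn.commit()

# Two separate loads, e.g. yesterday's batch and today's
load([{"donor_id": 101, "amount_usd": 25.5, "donated_on": "2024-03-01"}])
load([{"donor_id": 102, "amount_usd": 100.0, "donated_on": "2024-03-02"}])
```

A real warehouse adds what SQLite cannot: elastic storage, fast aggregation over millions of rows, and concurrent access for your whole team.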
3. Tools to Help You ‘Model the Data’
Once you've been able to centralize and store your data, you are ready to put it into a format that will let you start answering the questions your organization is asking. A modern data platform provides analysts at your organization with tools to define data dimensions, measures, calculations, and aggregates that can be shared across your entire organization for querying and exploring.
Tools we recommend at this stage include Looker, Superset, Mode, and Metabase.
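The modeling layer boils down to defining shared dimensions and measures once, so every analyst queries the same logic instead of rebuilding it in each report. A rough sketch of the idea (again using SQLite as a stand-in, with a hypothetical donations table) is a reusable rollup defined as a view:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE donations (donor_id INTEGER, amount_usd REAL, donated_on TEXT)")
conn.executemany(
    "INSERT INTO donations VALUES (?, ?, ?)",
    [(101, 25.5, "2024-03-01"), (102, 100.0, "2024-03-02"), (101, 40.0, "2024-04-10")],
)

# The "model": a dimension (month) and measures (donors, total) defined once
# and shared by the whole organization.
conn.execute("""
    CREATE VIEW monthly_giving AS
    SELECT substr(donated_on, 1, 7) AS month,       -- dimension
           COUNT(DISTINCT donor_id) AS donors,      -- measure
           SUM(amount_usd)          AS total_usd    -- measure
    FROM donations
    GROUP BY month
""")

rows = conn.execute("SELECT * FROM monthly_giving ORDER BY month").fetchall()
```

Tools like Looker formalize exactly this pattern: the definitions live in one governed place, and every dashboard or ad-hoc query draws on them.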
4. Tools to Help You ‘Analyze the Data’
This is the step where you can really dig in, ask questions of your data, and show and share insights. This 'layer in the stack' has established players (e.g., Tableau), but a new breed of tools is becoming very popular because of their ease of use, their web-native architectures, and their newer approaches to modeling and sharing access to data, dashboards, and visualizations within organizations.
Tools we recommend at this stage include Looker, Tableau, Domo, Superset, GoodData, Periscope Data, and Metabase.
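Whatever tool you pick, the analysis layer ultimately turns a plain-language question into a query against your modeled data. As a hypothetical example on the same kind of donations table used above, "which donor has given the most?" becomes:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE donations (donor_id INTEGER, amount_usd REAL, donated_on TEXT)")
conn.executemany(
    "INSERT INTO donations VALUES (?, ?, ?)",
    [(101, 25.5, "2024-03-01"), (102, 100.0, "2024-03-02"), (101, 40.0, "2024-04-10")],
)

# The analysis question: which donor has given the most in total?
top_donor, total_usd = conn.execute("""
    SELECT donor_id, SUM(amount_usd) AS total_usd
    FROM donations
    GROUP BY donor_id
    ORDER BY total_usd DESC
    LIMIT 1
""").fetchone()
```

The newer tools let non-technical staff pose this kind of question through a visual interface, then share the resulting chart or dashboard across the organization.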
More than ever before, the above tools are allowing mission-driven organizations to take hold of their data in a faster, easier, and more affordable way.