Once you have data in the Baseline system, there are several ways to view and work with it to create a report that fits your needs.
Visual Reporting Options
There are three formats in which you can view your data in the Reporting Site:
- Frequency/Tabular – This is the default mode. It shows you each answer option along with the raw number and percentage of respondents who selected that option.
- Graph – This allows you to view your data in graphical format. You can customize the settings for the graph, such as Chart Type (e.g., Bar, Pie, Area), Chart Values (Count, Percent, Mean), and other visual settings.
- Cross Tab – This allows you to view the number and/or percentage of students who responded in a particular combination based on their answers from two questions. For instance, you could see how many freshman males or sophomore females there are, or you could see the frequency breakdown of satisfaction ratings based on class year.
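Baseline's internal implementation isn't public, but the idea behind a cross tab is simple to sketch. The following Python snippet (with made-up field names and sample data) counts how many respondents gave each combination of answers to two questions:

```python
from collections import Counter

def cross_tab(responses, row_question, col_question):
    """Count how many respondents gave each combination of answers
    to two questions. Each response is a dict of question -> answer."""
    return Counter(
        (r[row_question], r[col_question])
        for r in responses
        if row_question in r and col_question in r
    )

# Hypothetical sample data: class year crossed with satisfaction rating
responses = [
    {"class_year": "Freshman",  "satisfaction": "Satisfied"},
    {"class_year": "Freshman",  "satisfaction": "Dissatisfied"},
    {"class_year": "Sophomore", "satisfaction": "Satisfied"},
    {"class_year": "Freshman",  "satisfaction": "Satisfied"},
]

table = cross_tab(responses, "class_year", "satisfaction")
print(table[("Freshman", "Satisfied")])  # → 2
```

Each cell of the Cross Tab view corresponds to one of these (row answer, column answer) counts; percentages are those counts divided by a row, column, or overall total.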
Basic Data Manipulation Options
Here are some basic options available to customize the way you view your data:
- Filter – This limits the results you see based on specified criteria. You can filter your results based on:
- Selected answer to a question (e.g., if you only want to see how Freshmen responded)
- Date range for when responses were logged (e.g., if you want to view responses from the past week)
- Panels, which are predetermined groups of respondents whose responses you want to view based on some criterion outside of their answers to the survey (e.g., Student Government members, list of students who attended an event from Anthology Engage)
- Add/Remove Questions – In this tab, you can choose which questions are visible in your Results, so you see only the questions relevant to your analysis. You can also add questions from other surveys that share common respondents, if that option is available to you. If you are uncertain, please contact your Baseline Consultant.
- Removing Questions – By default, all of the questions that are on a project are shown in the Reporting Site. If you were only interested in the answers to specific questions, though, you could select just those questions and eliminate “question clutter.” The data and questions will remain intact; you will just be hiding them from your view.
- Adding Questions – If you have access to Related Projects, you can select any question from one or more of those projects to include in the analysis of the current project. Any common respondents’ data will be available for analysis as though it were part of the existing project. For instance, if you have a project dedicated to student demographic data, you can add questions from that demographic project into your current project so you can use Academic Standing, Housing status, etc. as part of your analysis.
- Saved Views – If you customize the way you’re viewing your data, such as by applying filters or adding/removing questions, you can save this view and access it in the future without having to redo the customizations. Saved Views are dynamically linked to the data, so when you create a Saved View, it automatically updates as data are entered into the system.
- Exporting data – You can export your raw data into an external file for further analysis or reporting.
- Detail – This allows you to export all of the data from your project into Excel, Text, or SPSS. This is useful for conducting additional analyses on the data.
- View – This allows you to export just the data you are seeing. It takes the data exactly as it appears on the screen and exports it into Excel, Word, or PDF.
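To make the Filter option above concrete, here is a conceptual sketch in Python (not Baseline's code; field names and dates are hypothetical) of filtering responses by a selected answer and by the date range in which responses were logged:

```python
from datetime import date

def filter_responses(responses, answer_criteria=None, start=None, end=None):
    """Keep only responses matching selected answers and a logged-date range.
    answer_criteria maps question -> required answer, like a Baseline filter."""
    kept = []
    for r in responses:
        # Drop responses that don't match every selected answer
        if answer_criteria and any(r.get(q) != a for q, a in answer_criteria.items()):
            continue
        # Drop responses logged outside the date range
        if start and r["logged"] < start:
            continue
        if end and r["logged"] > end:
            continue
        kept.append(r)
    return kept

responses = [
    {"class_year": "Freshman", "rating": 4, "logged": date(2024, 3, 1)},
    {"class_year": "Senior",   "rating": 5, "logged": date(2024, 3, 2)},
    {"class_year": "Freshman", "rating": 2, "logged": date(2024, 2, 1)},
]

# Only Freshmen who responded in March
march_freshmen = filter_responses(
    responses, {"class_year": "Freshman"},
    start=date(2024, 3, 1), end=date(2024, 3, 31),
)
```

A Panel filter works the same way in spirit, except the criterion is membership in an externally defined list of respondents rather than an answer within the survey.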
Other Reports
Much of your data analysis and reporting can be accomplished within the Reporting Site. However, there are three other types of reports that you can create in the Baseline system:
- Comparison Reports – These allow you to compare data from similar projects; each (scale) question is subjected to a t-test to determine if there are any significant differences across data sets. Some situations in which you may use this include:
- Pre-Post tests – For instance, a pre-workshop survey and a post-workshop survey
- Different groups – For instance, comparing similar questions as answered on a faculty survey and a student survey
- Longitudinal studies – For instance, to compare similar questions from year-to-year
- Benchmarking Reports – These are similar to the Comparison Reports in terms of functionality, but since they are tied to national data, you can compare your institution’s data to:
- National Average – This is based on data from all participating campuses
- Peer group averages – This is based on data from a predetermined peer group (e.g., Carnegie classification, Conference affiliation) or a customized peer group of your choosing
- Anonymous peer institutions – This allows you to compare your data to a single institution from one of your peer groups. The data are anonymous, so you will not know which specific institution the data represent.
- Key Performance Indicators (KPIs) – These are snapshot metrics that allow you to quickly represent summary data graphically or textually. There are two types of KPIs:
- Static KPIs – These are manually entered data, often derived from sources outside of Baseline (e.g., Registrar information, CollegiateLink)
- Dynamic KPIs – These are metrics based on data that exist in Baseline. They are dynamically linked to data in Baseline, so as the data are entered into the system, the KPI automatically updates to reflect the new information. Dynamic KPIs can include data from the following sources:
- A single question within a single project – For instance, a campus climate survey may ask how likely the student is to return to the institution the following term
- A single question that appears in multiple projects – For instance, all Housing surveys may ask students how accepted they feel in their residence hall
- Multiple questions within a single project – For instance, a Dining Services survey may include a series of questions related to satisfaction with various aspects of the facilities (e.g., space, cleanliness, organization, proximity), and combine them into one metric for “Satisfaction with Facilities”
- Multiple questions that appear in multiple projects – For instance, asking students to what extent they met the learning outcomes for each workshop in a series and combining all learning outcome-related questions into a general “Learning Outcomes” metric.
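The per-question t-test behind a Comparison Report can be sketched conceptually. This is not Baseline's implementation; the ratings below are hypothetical, and a full significance test would also compare the statistic against a critical value for the appropriate degrees of freedom:

```python
from statistics import mean, variance

def welch_t(sample_a, sample_b):
    """Two-sample t statistic (Welch's version, which does not assume
    equal variances) of the kind applied to each scale question.
    Assumes each sample has at least two values and is not constant."""
    va, vb = variance(sample_a), variance(sample_b)  # sample variances
    na, nb = len(sample_a), len(sample_b)
    return (mean(sample_a) - mean(sample_b)) / (va / na + vb / nb) ** 0.5

pre  = [3, 3, 2, 4, 3]   # hypothetical pre-workshop ratings
post = [4, 5, 4, 5, 4]   # hypothetical post-workshop ratings
t = welch_t(post, pre)   # large |t| suggests a real pre/post difference
```

The same computation applies to the other scenarios above (faculty vs. student surveys, year-over-year comparisons): only the two samples being compared change.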
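A Dynamic KPI that combines multiple questions can likewise be sketched. This is a conceptual illustration only (question names and ratings are invented): because the metric is recomputed from whatever data are currently in the system, it updates automatically as new responses arrive.

```python
from statistics import mean

def dynamic_kpi(responses, questions):
    """Average several scale questions into one summary metric, the way a
    Dynamic KPI might combine items into "Satisfaction with Facilities".
    Recomputed from current data on each call, so new responses update it."""
    values = [
        r[q]
        for r in responses
        for q in questions
        if r.get(q) is not None  # skip unanswered questions
    ]
    return mean(values)

# Hypothetical Dining Services ratings on a 1–5 scale
responses = [
    {"space": 4, "cleanliness": 5, "proximity": 3},
    {"space": 3, "cleanliness": 4, "proximity": 4},
]
facilities_kpi = dynamic_kpi(responses, ["space", "cleanliness", "proximity"])
```

A cross-project KPI (such as the “Learning Outcomes” example) is the same idea with the response list drawn from several projects instead of one.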