How to Analyze Your Data with the ISIR Analysis Tool - PAGE 2
1. Excerpt from the User Guide: ISIR Analysis Tool 2008-09, page 50
2. Excerpt from the User Guide: ISIR Analysis Tool 2008-09, pages 49 and 53
3. Excerpt from the Hands On Advanced Workbook from the 2008 QA Pre-Conference, pages 10-12
4. Excerpt from the 2008 QA Pre-Conference Hands On Advanced Workbook, page 14
5. Excerpt from the 2008 QA Pre-Conference Hands On Advanced Workbook, page 14
6. Excerpt from the 2008 QA Pre-Conference Hands On Advanced Workbook, pages 17-21
7. Excerpt from the 2008 QA Pre-Conference Hands On Advanced Workbook, page 11
8. Excerpt from the User Guide: ISIR Analysis Tool 2008-09, page 45
9. Excerpt from the User Guide: ISIR Analysis Tool 2008-09, page 47
10. Excerpt from the User Guide: ISIR Analysis Tool 2008-09, page 49
11. Excerpt from the User Guide: ISIR Analysis Tool 2008-09, page 49
12. Excerpt from the User Guide: ISIR Analysis Tool 2008-09, pages 57 and 60
From the Domain Tree of the ISIR Tool:
- Select IA Tool 08-09 (left click on the mouse).
- Click the Standard Report icon (left click) in the left navigation bar and select IA Tool 2008 2009.
- Select 5. Field Increment Report. If you receive the pop-up box "Do you want to display the Non Secure Items", select "Yes".
- The Filters page will display, including this report’s exclusive option to "Select a Field for Increments." You must select a field here; see Table 3 on the next page for a complete list of available fields. Also select any desired filtering criteria.
- Click Submit.
When looking at a Field Change Report, you have access to two additional reports via the "drill down" functionality of the Tool. Both of these reports can help you target education or verification efforts.
First, you can look at the distribution of the records experiencing a change to the indicated ISIR field across value ranges of that same or other ISIR fields. This is done by clicking on the name of the desired ISIR field, which launches the Field Increment Report using data from only those records that experienced a change to the selected ISIR field. As we describe in more detail in the next section, this report shows you whether or not changes to the selected field were concentrated in a particular value range or ranges. This information allows you to target educational or verification efforts only on the most problematic range(s).
Are changes to a particular field concentrated within a particular value range of that or some other field?
When you drill down to the Field Increment Report from the Field Change Report, your results will display the distribution of all corrections to a single ISIR field. You interpret the findings just as described above, but they reflect only the changes to the field you selected.
You can use this information to help target verification or education efforts. When analyzing the changes detected by your school’s current verification, you can look for value ranges where corrections to ISIR information are rare or rarely affect EFC. If practical, you may want to consider excluding students within these ranges from future efforts.
Schools participating in the Quality Assurance Program can use the Field Increment Report to find value ranges with high percentages of corrections that affect EFC among records they typically do not verify. Schools should consider adding the most problematic ranges to future school verification efforts.
From the domain tree of the ISIR Tool:
- Click on Shared Reports folder.
- Go to + Brigham Young University and select + Custom Reports.
- Right click on "School Verification Analysis," select "Save As My Report," and click "OK."
These "reports" will produce counts of specific types of students by whether or not they were selected for verification and whether or not they experienced a "major change" in eligibility for need based aid. As explained below you have the option of setting your own threshold as to what change in EFC constitutes a "major" change.
Cutting and pasting the output of these reports into the Excel templates we have prepared will produce a series of 24 graphs that let you "see" your results in a format that we hope helps you identify areas of your verification procedures to tweak. As explained in more detail below, these templates assume the information is in a specific order, so it is essential that you not modify the content of the report. The data you upload into the ISIR Tool must contain at least some dependent-student records and (this sample year) some records not normally selected for verification. Other than the labels they apply to the output, the School and CPS versions of the two templates are the same. During this training we will work exclusively on the School verification version. If you are so inclined, you can follow the exact same steps for the CPS verification version when you get home.
- Go to My Reports, click on Custom Reports, then left click on School_Verification_Analysis.
- This will run the report and generate an Excel workbook.
- If you get a warning box asking how to handle the downloaded file, click OPEN.
- Left click the box to the left of column A to select the entire sheet, then copy it.
- Open the "School Verification Graphs" Excel document from the desktop.
- Paste the information generated by the School_Verification_Analysis report: place the cursor on cell A1 of the "Paste Data Here" tab of the School Verification Graphs template, right click, and select Paste.
- Click through the tabs at the bottom of the School Verification Graphs workbook to review your 24 graphs.
- Save the Excel spreadsheet.
- The report uses the AAA_COST_OF_ATTENDANCE field as a selection criterion, selecting only students who have an EFC (initial or paid on) less than the specified value. By default this value is set to $17,336, the national average for in-state public four-year students in 2007-08. To change this value, highlight AAA_COST_OF_ATTENDANCE and click the Edit Define Field button. Change the value to reflect the average cost of attendance at your school or some other meaningful threshold, e.g., your school’s EFC threshold for packaging grant aid or a maximum EFC for eligibility for a state scholarship. Click OK.
- By default the report defines as a major change in aid eligibility any change to a Pell award or an absolute-value change to EFC in excess of 400. If you would prefer a lower or higher EFC threshold, highlight AAA_MAJOR_CHANGE_DEF, click the Edit Define Field button, and change the value to any positive number you like. The higher you set your EFC threshold, the more exclusively your results will reflect changes to Pell awards. Click OK.
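The default "major change" rule above can be sketched in code. This is only an illustration of the rule as the text states it; the function and field names are ours, not part of the Tool.

```python
# Hypothetical sketch of the "major change" rule: any change to a Pell award,
# or an absolute-value change to EFC in excess of the threshold (default 400).
# Parameter names are illustrative, not the Tool's actual field names.

def is_major_change(initial_efc, paid_on_efc,
                    initial_pell, paid_on_pell,
                    efc_threshold=400):
    """Return True if the record counts as a 'major change' in aid eligibility."""
    pell_changed = initial_pell != paid_on_pell                   # any Pell change
    efc_changed = abs(paid_on_efc - initial_efc) > efc_threshold  # |ΔEFC| > threshold
    return pell_changed or efc_changed

print(is_major_change(1000, 1500, 2800, 2800))  # EFC moved by 500 > 400 -> True
print(is_major_change(1000, 1300, 2800, 2800))  # EFC moved by 300, no Pell change -> False
```

Raising `efc_threshold` makes the second condition harder to meet, which is why a higher threshold leaves your results reflecting mostly Pell changes.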
There are two types of graphs in the "Graph" template. The first type, found on the tabs labeled D# or I#, graphs the percent of records in the indicated category of students who:
- Were selected for verification and experienced a major change in aid eligibility - Black
- Were selected for verification and did not experience a major change in aid eligibility - Light Blue
- Were not selected for verification and experienced a major change in aid eligibility - Dark Blue
- Were not selected for verification and did not experience a major change in aid eligibility - White
We have designed this first type of graph to make it easy for you to identify the types of students your current verification procedures may be missing (dark blue) or may be selecting unnecessarily (light blue). These are the graphs you should concentrate on. The goal of your analysis is quite simple: "get rid of the blues" by modifying your verification efforts to target records with a major change (turn dark blue to black) and to eliminate records without a major change from your verification efforts (turn light blue to white). For training purposes we keep our discussion of these new graphs fairly simple; that is, we talk about how to interpret the results in isolation. In practice we encourage you to use these reports in combination with other information to apply more nuanced modifications to your verification procedures.
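The four-way breakdown behind these graphs can be sketched as follows. The record layout is hypothetical: each record is reduced to two booleans, whether the student was selected for verification and whether a major change occurred.

```python
# Illustrative sketch of the four-color breakdown shown in the D#/I# graphs.
# Each record is a (selected_for_verification, had_major_change) pair; the
# color labels follow the legend in the text.

from collections import Counter

def color_breakdown(records):
    """Percent of records in each verification/major-change combination."""
    labels = {
        (True, True): "black",        # selected, major change
        (True, False): "light blue",  # selected, no major change
        (False, True): "dark blue",   # not selected, major change
        (False, False): "white",      # not selected, no major change
    }
    counts = Counter(labels[r] for r in records)
    total = len(records)
    return {color: 100.0 * counts[color] / total for color in labels.values()}

# Example: four students, one in each group -> 25% apiece.
sample = [(True, True), (True, False), (False, True), (False, False)]
print(color_breakdown(sample))
```

"Getting rid of the blues" then means driving the "dark blue" share toward "black" (by adding those students to verification) and the "light blue" share toward "white" (by excluding them).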
If blue is "bad" than perfect results would be like TV in the 1950s, just black and white. Note that the accuracy of your students’ initial application is the only thing that determines the percent your school "should" verify.
It is probably not realistic to assume that any school would ever be able to achieve 100 percent accuracy in all the verification selection criteria. Don’t be discouraged if a little blue remains.
The first step in analyzing these results is to identify the categories of students who are the most "blue." It is probably easier to do this sequentially: first identify the types of students with the highest percentages not selected with a major change (dark blue), then go back through to identify which students had the highest percentages selected without a major change (light blue).
After you identify the students most likely to be missed by verification (dark blue), you need to consider whether or not it makes sense to add ALL students in the identified category to your verification criteria. This depends, of course, on the total percentage of students of this type who experienced a major change to aid eligibility. Consider the percent of records that were selected for verification with a major change (black). If a high percent of major changes are not being selected and a high percent of major changes are being selected, it might make sense to verify the entire group. See below for an illustration of the sliding scale you could apply to your own results. When your results fall in the "maybe" region, try to think of other information you could use to distinguish the subset of students that need verification from those that do not.
Likewise, after you identify the students most likely to be selected unnecessarily by verification (light blue), you need to consider whether or not it makes sense to simply exclude ALL students in the identified category from verification. This will depend on the total percentage of students of this type who did not experience a major change. An easy way to see this in the graph is to consider the percent of records not selected for verification without a major change (white). If a high percent of records without major changes are not being selected and a high percent of records without major changes are being selected, it might make sense to exclude the entire group. See an illustration of this below. When your results fall in the "maybe" region, try to think of other information you could use to distinguish the subset of students that do not need verification from those that do.
While we encourage you to focus on the graphs that display the percentage breakdowns of verification selection and major-change status, you also need to consider the number of records your findings are based on before you make changes to your verification criteria. The second type of graph, denoted by D#N or I#N tabs, consists of stacked bar charts in which records selected for verification are shown in black and records not selected are shown in white. Here is an example of this type of graph.
Generally speaking, the larger the number of records, the more confidence you can place in the results. At a minimum, you should have at least five applicable records that were selected for verification before you use your results as a basis to reduce the scope of current verification efforts. Likewise, you should have at least five applicable records that were not selected for verification before you consider expanding verification based on the corresponding finding. So if there are four or fewer applicable records, we recommend forgoing any analysis. When there are 5 to 24 records, take your results with a grain of salt.
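The rule of thumb above can be summarized as three tiers. This sketch is ours: the tiers follow the text (fewer than 5 records, 5 to 24, and larger counts), and the labels are illustrative.

```python
# Rule-of-thumb tiers for how much weight to give a finding, based on the
# number of applicable records (selected or not selected, as appropriate).

def confidence_level(applicable_records):
    if applicable_records <= 4:
        return "skip"      # four or fewer records: forgo any analysis
    if applicable_records <= 24:
        return "caution"   # 5 to 24 records: take results with a grain of salt
    return "ok"            # larger counts support more confident conclusions

print(confidence_level(3))   # skip
print(confidence_level(10))  # caution
print(confidence_level(50))  # ok
```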
- IMPORTANT, BUT RARELY AN ISSUE: make sure that the Excel output contains all the combinations of SELECTED_BY_SCHOOL and MAJOR_CHANGE in the order indicated below. The vast majority of schools will have all their bases covered. Note there are two merged rows between the field name and the values. If any row is missing, simply use Excel to insert a row in the right location(s). You do not need to add any content to the row; a blank row will correctly position whatever data are there.
- JUST AS IMPORTANT, BUT EVEN MORE RARE: you must have some dependent students in the data uploaded to the Tool for the information to land in the right location for the graphs produced by the templates. In the unlikely event that you have ONLY independent records uploaded and want to run the shared report and use the templates, you must highlight cells C1 to AC1, select Insert, and choose Columns. This will correct the placement of data when only independent records are present.
- From the Domain Tree, select IA Tool 08-09 (left click on the mouse).
- Click the Standard Report icon (left click) in the left navigation bar and select IA Tool 2008 2009.
- Select 4. Field Change Report. If you receive the pop-up box "Do you want to display the Non Secure Items", select "Yes".
- On the Filters page, select any desired criteria. In the example report that follows, under Dependency Status we selected Dependent, and under CPS Verification Flag we selected Yes.
- Click Submit.
- By default the Field Change Report displays changes to all ISIR fields. See the screenshot below.
- Enlarge the Verification Summary Report on the Dashboard or generate a Verification Summary Report under the Domain Tree. See the previous section of this chapter that explains how to generate this report.
- Select a subset of students represented by a single bar or entry in the text data table. Click on that bar or number.
- The initial version of the Field Change Report will be generated using only those records that were represented by your selected subset of students.
Second, you can see how many of the records experiencing a change to an indicated field were dependent vs. independent and initially eligible for Pell vs. initially ineligible. This is accomplished by clicking on any of the counts or percents in the Field Change Report. This second drill-down option opens the Dependency and Pell Eligibility Status Report. As we will describe in the section after the Field Increment Report, you can use this information to target education or verification efforts. You can also use this report to assess the percentage of students selected for School or CPS verification in each of the four dependency and Pell eligibility defined groups.
Note that the number you click on to drill down to the Dependency by Pell Eligibility Report matters. Only records satisfying all the conditions implied by the Field Change entry will be included in the subsequent report. For example, if you select an entry under either of the Records Corrected (# or %) columns, all records with the associated change will be included. However, if you select an entry under the EFC Increase column, only those records with the indicated change and an EFC increase will be brought forward.
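The drill-down filtering described above can be illustrated with a small sketch. The record fields here are hypothetical stand-ins for the Tool's internals.

```python
# Illustrative sketch: clicking a "Records Corrected" entry carries forward
# every record with the associated change, while clicking an "EFC Increase"
# entry keeps only records whose EFC also increased.

records = [
    {"id": 1, "field_changed": True,  "efc_delta": 300},   # changed, EFC rose
    {"id": 2, "field_changed": True,  "efc_delta": -200},  # changed, EFC fell
    {"id": 3, "field_changed": False, "efc_delta": 0},     # no change
]

# Drill down from a "Records Corrected" entry: all records with the change.
corrected = [r for r in records if r["field_changed"]]

# Drill down from an "EFC Increase" entry: changed records with an EFC increase.
efc_increase = [r for r in records if r["field_changed"] and r["efc_delta"] > 0]

print([r["id"] for r in corrected])     # [1, 2]
print([r["id"] for r in efc_increase])  # [1]
```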
By clicking on any of the blue counts or percentages in the Pell Eligibility by Dependency Status Report you can drill down to the Student Listing Report. This report provides a list of individual students including: identifying information; a summary of the change to EFC and Pell; and the student’s School and CPS verification status. You can use this information to identify individual students to review as part of your analysis.
By clicking on the social security number of a student (in the Student Listing Report), you can drill down to the Student Detail Report for that student. This report provides a complete listing of all the changes to ISIR information that occurred between the initial and paid on transaction for that student. As discussed above, this is the information you use to compare students with different verification outcomes. The main purpose of the Student Listing Report is to make it easier for you to find the right Student Detail Reports to view.