Core Competency 3 — Mathematical Concepts
The following data was collected by Alpena Community College as part of its ongoing student outcomes assessment process. (Last updated March 22, 2022.)
Student Outcomes
Semester | Full-Time Faculty Participants (Distinct) | Course Assessments | Course Assessments Meeting Benchmark | Percentage Meeting Benchmark
Fall Semester 2020 | 3 | 3 | 13 | 100.0%
Spring Semester 2021 | 2 | 2 | 4 | 100.0%
Fall Semester 2021 | 5 | 5 | 5 | 83.3%
Core Competency 3 Comments and Action Plans
Faculty reviewed results and made the following comments and action plans:
Fall Semester 2020
- None needed.
Spring Semester 2021
- Celebrate: despite COVID, a majority of the students achieved the goal.
Fall Semester 2021
- Examine the final exam to determine whether certain problems did not get enough practice time during the semester. Unfortunately, the two students who did not meet the passing score did very little homework.
- No action required at this time. Will continue to monitor for trends.
Historical Data
October 2015 Questions and Results
My suggestion was to consider having the students take the Core Competency Surveys as part of their graduation application process. You would have a random sample, and you would be assessing students at the end of their time here, so you would have the greatest chance of the students being in possession of the required skills. You should have 100% of surveys completed with no abandoned attempts. This method would also include an incentive for the student at no additional cost to the school.
_______________________________________________________________________________
I wanted to suggest that a possible sample of students for the initial Core Competency Survey would be students attending the mandatory orientation. Not only would they be a captive group, but it would also be at the beginning of the students’ academic work at ACC. Since the survey does not take an inordinate amount of time, it seems as though it would be easy to fit in with the orientation schedule. Since it is a random sample, it could be the last activity when the group is together for those selected. The idea of a beverage incentive for each student participating sounded great.
I was also wondering if it would be possible to create an app for the exit-survey version so that students could take it on their phones.
One suggestion regarding the assessment of core competencies would be to make the question process part of mandatory orientation, as a beginning control group, and then administer it again in the final semester of the students' program to compare outcomes and gauge improvement. I understand the concern about whether the results are a good representation of the truth or whether students are just completing the survey to get it done, but that concern could apply to any completion done just to get the cup of coffee.
Another area I suggested was to get the students' perspective on how to get good responses.
Here are my comments:
- May want to provide more of an incentive for the students to complete the competency assessments. This may improve the participation rate.
- Consider having the students complete the assessment in the Testing Center. The instructor can then monitor participation and provide a few extra-credit points for a class.
- Have the assessment completed by second-year students.
I mentioned a couple of things during the start-up of the outcomes assessment data collection. They were:
- Using a small incentive for all students instead of one large chance drawing for one student to win. Example – free soda or drink from the bookstore. Maybe it won’t make a difference, but it might be worth a try.
- When looking at data results, we should break results into two groups:
- All students
- Students who have attended a minimum of three semesters – closer to graduation.
- Cleaned up the display problems for questions 9 & 10.*
- Work on institutionalizing the use of the student's college email for notification of the test.
- Had faculty bring their classes to the library to take the assessment, rather than leave it up to the students to take the test on their own.

*In the first round of the survey, there were display issues with the formulas in two of the questions, questions 9 and 10. These issues were resolved when the survey was reissued for the second round.
D. April 2017 – Re-assessment Survey (139 Responses)
April 2017 Questions and Results