Graduate Program Outcomes Data

Results of Assessing Graduate Student Progress

We assess student progress on learning objectives through an annual process. First, students self-evaluate their progress on the learning objectives; then each student's Advisory Committee conducts the same assessment in the context of a committee meeting. This process is required of all students on Graduate Assistantships, but not of students graduating from the program in the same year.

2025 Department of Geosciences Annual Graduate Program Assessment Table

(12 student forms submitted)

[Image: 2025 assessment results]

2024 Department of Geosciences Annual Graduate Program Assessment Table

(8 student forms submitted)

[Image: 2024 assessment results]

2023 Department of Geosciences Annual Graduate Program Assessment Table

(6 student forms submitted)

[Image: 2023 assessment results]

2022 Department of Geosciences Annual Graduate Program Assessment Table

(11 student forms submitted)

[Image: 2022 assessment results]

Graduate Degrees Granted from the Department of Geosciences

Year         MS (thesis-based)   AEG-MS (coursework-based)   PhD   Total
2015-2016            7                      1                 1      9
2016-2017            8                      3                 0     11
2017-2018            8                      3                 1     12
2018-2019            7                      1                 0      8
2019-2020            2                      0                 0      2
2020-2021            4                      2                 2      8
2021-2022            4                      1                 0      5
2022-2023            6                      0                 1      7
2023-2024            8                      2                 2     12
2024-2025            7                      3                 0     10
Total               61                     16                 7     84

Employment Outcomes

Tracking students after they complete our program shows that graduates are highly successful in obtaining employment in the geosciences and in gaining acceptance to further graduate programs.

Students who graduated during the intervals noted below pursued the following post-graduation careers:

[Image: pie charts of post-graduation careers]

Job                         2009-2016    n    2017-2023    n
Energy                         34%      15       4%        2
Mining                          2%       1       2%        1
Government                     11%       5      14%        7
Environment/Geotechnology      23%      10      38%       19
Education                      20%       9      12%        6
PhD/Post Doc                    9%       4      20%       10
Other                           0%       0      10%        5

Department Response to Assessment Results (2022-2025)

The assessment results above are summarized here for the discussion that follows.

[Image: summary of assessment results]

Observation: Foundational Skills Learning Outcome
  • Students and committees give similar assessments
  • Most students and committees assess foundational skills as ‘at’ where the student should be
Observation: Research Skills Learning Outcome
  • Most students and committees assess student research skills as ‘at’ or ‘above’ expectation
  • Students tend to rank their skills lower than the committee does
Observation: Communication Skills Learning Outcome
  • Most students and committees assess student communication skills as ‘at’ or ‘above’ expectation
  • Students tend to rank their communication skills lower than the committee does
Observation: Professional Development Skills Learning Outcome
  • Committees rank students ‘at’ or ‘above’, with no cases of ‘below’
  • Students routinely rank their professional development skills lower than the committee does

Feedback from USU Office of Data Analytics

2022 – 2023 Review

Graduate Program Assessment – MS and PhD Programs

The Department of Geosciences is committed to ongoing evaluation and improvement of our graduate programs. Each year, the Office of Data Analytics (ODA) reviews our assessment practices to ensure we are effectively measuring student learning and using the results to strengthen our programs. Below is a summary of the most recent review (AY 2022–23).

Strengths
  1. Clear documentation and transparency – The assessment plan is available on the department’s website, making it accessible and transparent.
  2. Alignment with institutional standards – The plan follows the College of Graduate Studies guidelines, ensuring consistency with broader university expectations.
  3. Regular and systematic assessment – Assessments are conducted annually, showing a structured and consistent approach.
  4. Integration of multiple assessment tools – The plan uses Individual Development Plans (IDPs), annual progress reports, self-assessments, and committee evaluations, providing a well-rounded view of student progress.
  5. Direct connection between assessment and learning outcomes – The categories on the evaluation form are mapped directly to program learning outcomes, ensuring that assessments measure what the program intends students to learn.
  6. Use of aggregated data for program improvement – Reports are aggregated and reviewed by directors and faculty to identify trends and guide program improvements, showing a feedback loop between assessment and decision-making.
  7. Committee involvement and individualized feedback – The supervisory committee plays an active role in evaluating and mentoring each student, emphasizing personalized oversight.
Areas for Growth
  1. Clarify learning objectives: While skills and knowledge outcomes exist, most lack detail on how students will demonstrate mastery.
  2. Use more specific, measurable language: Some objectives would benefit from action-oriented verbs (e.g., analyze, apply, create) aligned with Bloom’s taxonomy to clarify expected student performance.
  3. Strengthen the link between assessment and instruction: Incorporate reflection on how outcome measurement results can be used to improve teaching and learning.
  4. Develop measurable benchmarks: Continue refining outcomes to include clear, quantifiable benchmarks directly tied to assessment activities, improving both clarity and accountability.
Next Steps

In response to the ODA review and our own observations of our assessment process, the Department of Geosciences will:

  1. Develop a rubric that defines what constitutes "below," "at," or "above" levels for each learning outcome to ensure consistency across committees and in student self-assessments.
  2. Outline when each part of the assessment process will be implemented each year, including timelines for data collection and review.
  3. Establish a data review and analysis process to document how data from rubrics and surveys will be collected, analyzed, and used to improve learning.
  4. Consider ways to include indirect measures and disaggregated analyses to provide a broader perspective on student performance.