Automating the assessment of CAD models has been the focus of significant research effort. One area of application is grading in support of training engineering students in 3D parametric modeling skills and practices. However, significant challenges remain in producing broadly accepted tools of practice, owing to the complexities involved in creating a CAD model and in identifying formal evaluation criteria that robustly capture whether skills have been acquired. Of particular interest is whether tools can be developed that provide more robust formative assessment of a modeling activity. This contrasts with summative assessment approaches, which largely benefit the assessor by reducing grading time through evaluation of the result, but which can miss important tendencies in a student designer that might need to be corrected. For formative assessment to be feasible, better metrics are needed that reflect how a modeling activity is progressing, not just with respect to realizing a final shape goal, but also in capturing design intent and meeting best practices. In this paper some of the challenges of evaluating 3D CAD modeling efficacy are explored. These challenges increase with the level of complexity desired in the result, which can range from simply creating a final 3D shape, to capturing design intent, to demonstrating skill at incorporating best practices. A case study of a capstone modeling project given to students in an introductory CAD class is used to illustrate these challenges. This example also highlights the difficulties encountered in assessing more open-ended modeling experiences, where students are given less guidance and have many more options available to satisfy the modeling requirements. A simple case study is also presented to demonstrate the viability of collecting a more complete set of assessment metrics during a modeling activity. Finally, a discussion is presented of how access to a richer set of metrics might lead to a better understanding of modeling tendencies.