
Computer Bulletin Board
edited by Steven D. Gammon, Western Washington University, Bellingham, WA 98225-9150

Assessment of a Data-Verification System for the General Chemistry Laboratory


Laura E. Pence* and Harry J. Workman Department of Chemistry, University of Hartford, West Hartford, CT 06117; *[email protected]

In the late 1980s, the chemistry faculty at the University of Hartford assessed the effectiveness of the general chemistry laboratory program and discovered a number of problems that were endemic to the introductory environment. Many students left the lab as soon as they finished the experimental work and did not stay to complete the required calculations. A natural student tendency to postpone work and the substantial number of students waiting to discuss their calculations with the lab instructor both contributed to the problem. As a result, the students carried out a minimum of data checking, delayed doing calculations, and simply forgot or misunderstood the significance of their recorded values. The consequence was poor-quality discussion of the results in the lab reports; it is difficult to discuss a numerical result if one is unable to calculate that value! Laboratory instructors also found that checking and grading students' data and calculations was very time consuming, particularly since the students paid little attention to using an appropriate number of significant figures. Poor work practices such as waiting until the night before the report was due to do the required calculations, careless number-handling habits, and weak data-analysis skills had a strong negative effect on lab reports, which frustrated students and instructors alike.

The critical importance of effective communication skills to industrial and academic chemists has been emphasized repeatedly in the literature (1-3), and numerous projects at other institutions have been implemented to address specific deficiencies in writing abilities (1-8). At this university, students at the freshman level were already writing complete lab reports as part of the focus on teaching solid writing skills; however, the quality of the written reports was regularly being undermined by careless or incomplete data handling. As a result, the faculty sought a way to strengthen the students' work habits as a means of improving the graded products of each experiment. Reinforcing fundamental principles, such as accuracy in calculations and correct application of significant figures as a first method of estimating uncertainty, was an additional priority. These aims contrasted with those of other computer programs that have been implemented with the specific goal of easing the grading burden on the instructors (9-14).

The three major goals of the initiative were (i) to improve students' work habits, (ii) to require the students to revise their calculations, assisted by computer feedback, until each student's data handling was 100% correct, and (iii) to store the data entered by the students to monitor the results of the experiment in general. To address these issues the faculty developed and implemented computer programs, written in Pascal, that would allow students to enter measured and calculated results and subsequently receive feedback about mathematical accuracy and correct use of significant figures.

Each program was written specifically for a single experiment; qualitative analysis experiments that did not require calculations were not included in the project. The programs had predefined data-entry locations, indicated by "< >" as seen in Table 1, and a student was able to navigate easily from one entry location to another in any order required to enter or change values. After a full screen of data was entered, the computer program would begin the checking process. The programs did not correct the errors that were identified; instead, the problematic entry and the type of difficulty were specified. Significant figures were checked first, followed by mathematical accuracy. Some quantities were also checked for whether the magnitude of the answer was reasonable based on equipment size, equipment limitations, and the requirements of the individual experiment. When all of the entries were correct and complete, a hard copy of the data printed automatically. If the data-entry or checking process was incomplete, a hard copy could still be printed, but a notation appeared on the copy indicating the incomplete status. The hard copy of the completed process was a required component of the laboratory report, which motivated the students to stay in lab until they had successfully completed the verification program. A sample data-entry screen is presented in Table 1; a sketch of the checking logic follows the table.

Table 1. One Data-Entry Screen in the Volumetric Analysis Part A Experiment

Instructor: Pence                            Date: 10/31/02
Experiment: Standardization of NaOH          Section: L1    Done by: Jean Little

Parameter                                    Quantity/Run #1
Mass of bottle and oxalic acid dihydrate     < >  g
Mass of clean bottle                         < >  g
Mass of oxalic acid dihydrate                < >  g
Molecular weight of oxalic acid dihydrate    < >  g/mol
Moles of oxalic acid dihydrate used          < >  mol
Initial reading of buret with NaOH           < >  mL
Final reading of buret with NaOH             < >  mL
Volume of NaOH used                          < >  mL
Molarity of NaOH solution                    < >  M

Press F1 to go to the next page
Press F2 to escape from the program
NOTE: Student entries appear in bolded font.
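The checking sequence described above (significant figures first, then arithmetic accuracy, then a coarse reasonableness bound on the magnitude) can be illustrated with a minimal sketch in Python. This is a reconstruction for illustration only, not the authors' Pascal source; the tolerance, bounds, and sample field are assumptions.

    import math

    def count_sig_figs(entry: str) -> int:
        """Count significant figures in a numeric string such as '0.05120' or '5.120e-3'."""
        mantissa = entry.strip().lower().split("e")[0].lstrip("+-")
        digits = mantissa.replace(".", "").lstrip("0")   # leading zeros are never significant
        if "." in mantissa:
            return len(digits)                # trailing zeros after a decimal point are significant
        return len(digits.rstrip("0")) or 1   # bare trailing zeros treated as not significant

    def check_entry(entry: str, expected: float, sig_figs: int,
                    low: float, high: float) -> str:
        """Apply the three checks in the order the programs used; return feedback or 'OK'."""
        value = float(entry)
        if count_sig_figs(entry) != sig_figs:
            return "incorrect number of significant figures"
        if not math.isclose(value, expected, rel_tol=5e-3):
            return "arithmetic error: recalculate this quantity"
        if not low <= value <= high:
            return "value is not reasonable for this measurement"
        return "OK"

    # Moles of oxalic acid dihydrate (MW 126.07 g/mol) from an assumed 0.6455 g sample:
    print(check_entry("5.120e-3", 0.6455 / 126.07, sig_figs=4, low=1e-4, high=1e-1))  # OK
    print(check_entry("5.1e-3", 0.6455 / 126.07, sig_figs=4, low=1e-4, high=1e-1))
    # -> incorrect number of significant figures

Working from the string the student typed, rather than from a parsed number, is what makes a significant-figure check possible at all: a floating-point value such as 5.1e-3 retains no record of trailing zeros.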


An additional benefit of these data- and calculation-verification programs is that they present an unusual strategy for consistently addressing the issue of significant figures throughout the semester. Like a number of general chemistry classes at other institutions (15-19), our first-semester course includes an activity at the start of the semester to demonstrate the importance and relevance of significant figures. Unfortunately, the lesson seems to be rapidly forgotten as the students are confronted by increasingly challenging conceptual and mathematical material. The computer verification programs, used in more than half of the experiments in both semesters of general chemistry laboratory, provide a method of constantly emphasizing the need to use correct significant figures. Anecdotal evidence suggests that mastery of significant figures is retained beyond the freshman level. In a recent advanced synthesis course, which included both students who had taken general chemistry at this university and students who had taken general chemistry elsewhere, the students who had spent a year with the reinforcement of the computer verification programs had a vastly superior grasp of significant figures in simple calculations. The computer programs have been used consistently in the laboratories for more than fifteen years, and we have recently taken the opportunity to carry out a formal assessment of their utility.

Qualitative Assessment

It has been the extensive experience of the laboratory instructors that, in the beginning, the students universally despise the computer programs. Students would prefer to postpone the calculations until some later time, and they never like to be told that their work is incorrect. Possibly their biggest frustration is that, until this experience, they have rarely been forced to keep working on a calculation until it is absolutely correct. The students' initial grasp of significant figures is quite weak, and a mathematical error that affects an early quantity obviously requires the recalculation of all subsequent quantities. The verification programs do supply some feedback about the type of error that has been identified, such as significant figures versus arithmetic, but the students often prefer to guess at the correct answer rather than actually correct their logic. The lab instructors can help alleviate the students' frustration by emphasizing the positive aspects of the programs. The students do appreciate having the hard copy of the data, which does not have to be retyped for the report, and eventually they understand the benefits of knowing that their calculations are correct and complete before they leave the laboratory. The instructors have found that grading is significantly streamlined as a result of the computer verification programs, and the quality of the calculations and discussions within the reports improved substantially compared to reports from previous classes that did not use the programs.

Since no project can solve all problems with student lab reports, a few limitations must be noted. To the annoyance of the lab instructors while they are grading, there is too often inadequate carry-over from the computer programs to the calculation section of the lab report. Even though the computer-printed copy of the data includes the correct measured and calculated quantities with correct significant figures and units, the students sometimes use their original notebook values rather than the checked values, and when they write up their calculations, units may be omitted completely. During the lab period, the programs do clean up some of the student entries by removing spaces and restricting certain responses to purely numeric entries, but other data-entry errors are more challenging to identify. Some of the most frustrating problems occur when students try to correct a mathematical error by using the slightly inaccurate values from their notebooks instead of the corrected values entered into the computer program. It is difficult for an instructor to respond to a request for help with a particular calculation when the mathematical algorithm is actually correct and the student has simply neglected to use the updated numbers. Recognizing that Avogadro's number has been entered as 6.02 × 10⁻²³, with a negative exponent rather than the correct positive exponent, is another example of a very subtle data-entry error that even experienced instructors may initially overlook. A small sketch of this kind of entry screening follows.
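The sketch below illustrates entry clean-up plus a coarse order-of-magnitude screen that would catch errors such as the negative-exponent slip. The allowed-character rule and the exponent window are assumptions for illustration; the article does not specify the exact filters the programs applied.

    import re
    from math import log10

    def sanitize(raw: str) -> str:
        """Strip spaces and any character that cannot appear in a numeric entry."""
        return re.sub(r"[^0-9eE+\-.]", "", raw)

    def magnitude_ok(entry: str, low_exp: float, high_exp: float) -> bool:
        """Accept a value only if its power of ten falls inside the expected window."""
        value = float(sanitize(entry))
        return value > 0 and low_exp <= log10(value) <= high_exp

    print(sanitize("1 205.3 g"))                 # -> 1205.3
    print(magnitude_ok("6.02e23", 22.0, 24.0))   # -> True
    print(magnitude_ok("6.02e-23", 22.0, 24.0))  # -> False: the negative exponent is caught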






Quantitative Assessment

Starting in the fall 2001 semester, each computer program was altered to include a counter for each type of error. We have now collected data for two fall semesters to assess whether the computer programs result in an improvement in student performance over the course of a semester. Table 2 shows the analysis of the average number of errors per student per experiment. As shown in the table, the values for a single experiment are extremely similar between the two semesters, which suggests that our results are not random or due to anomalous groups of students.

Because the number of individual computer entries varied for each experiment and because the measurements and calculations ranged widely in complexity, we introduced several factors that attempted to normalize the statistical assessment and hence allow us to compare the student errors for each experiment more effectively. Factors were included to account for repetitious measurements and calculations in experiments with multiple runs, to differentiate between measured and calculated values, and to reduce the weight on calculations of mass, which require simple subtraction.

Table 2. Total Errors per Student in Each Experiment

                              Average Total Errors(a)
Experiment                  Fall 2001       Fall 2002
Gay-Lussac                  3.83 (3.5)      3.98 (3.5)
Mg/HCl                      5.52 (4.0)      5.30 (3.9)
Volumetric analysis A       6.40 (6.2)      6.40 (6.2)
Hess's law                  6.49 (4.4)      6.49 (4.8)
Volumetric analysis B       11.53 (7.7)     11.16 (7.8)
Copper oxalate              12.67 (7.9)     13.76 (7.9)
Avogadro's number           8.03 (5.0)      6.87 (4.3)
Avogadro's number(b)        10.33 (6.1)     10.29 (6.2)

(a) The errors are analyzed annually and collectively and have not been normalized for the number of values entered in each experiment or for repetition factors. The numbers in parentheses are standard deviations. (b) The second set of Avogadro's number values is from spring 2002 and spring 2003, respectively.


Other than the mass calculations, we were unable to control for the differing complexity of each calculation, so we were aware that this remaining factor could influence our results. Additionally, we ultimately based our comparisons on the calculated values in each experiment, which are subject to both arithmetic and significant-figure errors, rather than including the measured values, which do not suffer from mathematical errors.

When selecting which data from the student error statistics to include in our analysis, we added two additional criteria. During experiments in which the students work with partners, we have often observed occasions in which one student carried out the majority of the calculations and the second student simply used the first student's printed copy of the correct results to enter his or her own information into the computer. Since these situations obviously did not reflect the actual performance of the second partner, instances in the computer program database in which a second partner had no errors were eliminated from the student error data set. Additionally, at the end of a long or frustrating lab period, some students tend to rely on random guessing rather than actual recalculation to remedy incorrect significant figures, and this situation usually results in an excessively high number of errors. As a result, students whose arithmetic or significant-figure error counts were more than two standard deviations from the mean were removed from the data set for that experiment. The standard deviations in Table 2 express the resulting range in the number of errors made by the students after the outlying data were removed. We were aware that both of these adjustments potentially introduced a systematic bias into the data, but we were satisfied that we could draw at least general conclusions from the adjusted results.

The analysis of two fall semesters of student performance with the computer data- and calculation-checking programs is shown in Table 3. In the final column of Table 3, the ratio of the number of errors to the adjusted number of calculated values allows for comparisons among different experiments. We used these resulting numbers to analyze the students' progress over the semester. Overall, the number of errors per calculated value, or student error rate, decreases from the beginning to the end of the semester. The higher student error rate associated with the first two experiments is expected as the students gain experience with the computer system and with significant figures. Student error rates for subsequent experiments remain reasonably consistent, in spite of being superimposed on an increasing complexity of calculations.

Table 3. Comparison of the Error Rates per Student

Experiment                  Week of Semester    n(a)    Error Rate(b)
Gay-Lussac                  2                   176     1.10
Mg/HCl                      3                   227     1.35
Volumetric analysis A       5                   258     0.92
Volumetric analysis B       6                   212     0.93
Hess's law                  9                   196     0.89
Copper oxalate              10                  172     1.26
Avogadro's number           11                  201     0.97

(a) The number of student data sets after outlying data sets were removed. (b) Number of errors per adjusted number of calculated values.


The copper oxalate experiment stands out as having an anomalously high error rate for late in the semester. This experiment requires individual gravimetric and volumetric procedures, and the calculations are substantially more complicated and involve less repetition than those in the rest of the experiments, all of which contributes to the students demonstrating reduced accuracy and patience when processing their data. The otherwise consistent reduction in the student error rate was observed over several semesters, even when the order of the experiments was changed. All of these results demonstrate the students' greater facility with significant figures and calculations after a semester of reinforcement through the computer data-verification programs.
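The outlier screening and error-rate computation described in this section reduce to a short sketch. The toy error counts and the repetition weight below are illustrative assumptions; the article reports the screening rules but not the numerical values of the weighting factors.

    from statistics import mean, stdev

    def trim_outliers(error_counts: list[int]) -> list[int]:
        """Drop zero-error second-partner copies, then counts beyond two standard deviations."""
        kept = [e for e in error_counts if e > 0]
        m, s = mean(kept), stdev(kept)
        return [e for e in kept if abs(e - m) <= 2 * s]

    def error_rate(error_counts: list[int], n_calculated: int,
                   repetition_weight: float = 1.0) -> float:
        """Average errors per adjusted number of calculated values for one experiment."""
        adjusted = n_calculated * repetition_weight
        return mean(trim_outliers(error_counts)) / adjusted

    # Toy data: the 0 (a partner copy) and the 30 (random guessing) are discarded,
    # leaving an average of 3.83 errors over an assumed 3.5 adjusted calculated values.
    counts = [4, 4, 3, 5, 4, 3, 0, 30]
    print(round(error_rate(counts, n_calculated=7, repetition_weight=0.5), 2))  # -> 1.1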

Additional Benefits

Data Pooling
The laboratory information database created from these computer data-verification programs has proven extremely convenient for a number of other uses in addition to the reinforcement of good work habits and the correction of the students' data. The pair of volumetric-analysis experiments that we use in general chemistry involves two weeks of titrations. In the first week, the students standardize the NaOH solution that they will need for the subsequent week. Rather than have an instructor standardize the solution to find the concentration, the laboratory supervisor uses the students' data contained in the computer database. In this process, approximately sixty values are combined to generate the working molarity, which should be a more accurate value than any single person's result.

Grading Student Performance
Grading is another activity that benefits from the information in the database. Most obviously, instructors no longer need to check students' calculations by hand. In addition, the database provides the ability to assess the quality of a student's results objectively. A component of each student's grade is based on performance, and about halfway through the fall semester a few performance points are assigned based on a student's ability to produce reasonably accurate results. For these experiments, the database is used to establish an average value and standard deviation of a calculated quantity, such as the molarity in the standardization of NaOH. Students whose results are within one standard deviation of the average of all the students in the data set receive full credit. For each standard deviation outside that range, the students lose a point. This method allows instructors to grade to a small extent on technique, but does not penalize the students for a potentially systematic error caused by the procedure, equipment, or chemicals.
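Both database uses just described, pooling the class standardization data and assigning performance points by standard-deviation distance, amount to a few lines each. The molarities and the point scale below are invented for illustration; in practice roughly sixty student values feed the pooled average.

    from statistics import mean, stdev

    def pooled_molarity(values: list[float]) -> float:
        """Combine every student's standardization result into one working molarity."""
        return mean(values)

    def performance_points(result: float, values: list[float], full_credit: int = 3) -> int:
        """Full credit within one standard deviation; one point lost per additional SD."""
        m, s = mean(values), stdev(values)
        return max(0, full_credit - int(abs(result - m) / s))

    class_data = [0.1012, 0.0998, 0.1005, 0.1021, 0.0990, 0.1008]  # ~60 values in practice
    print(round(pooled_molarity(class_data), 4))   # -> 0.1006
    print(performance_points(0.1007, class_data))  # within 1 SD -> full credit (3)
    print(performance_points(0.1060, class_data))  # ~5 SD out -> 0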


semester’s data. A second incident of cheating, which would not have been spotted without the database, occurred when students were finding an experimental value for the percent acetic acid in vinegar. In a lab section of 24 values, the single value of 3.50 stood out from the rest of the numbers, which ranged from 4.10 to 4.50. Upon further investigation, the student who had obtained the value of 3.50 had taken the class the previous year, when the prepared samples of vinegar did indeed contain approximately 3.50% acetic acid. It was easy to check the previous year’s data and verify that the student had simply reused her entire set of previous data rather than repeating the experiment. Although dealing with cheating is one of the least enjoyable aspects of teaching, the indisputable evidence from the freshman laboratory computer database has allowed us to handle these situations decisively and internally without merely pitting a professor’s word against a student’s. Implementation To implement a system similar to the one described, there are number of obstacles to overcome. Even under ideal conditions a significant effort is required to develop templates for each experiment associated with a course. We offer several options for other faculty who wish to develop similar systems at their own schools. First, the executable files for several of our programs are available in the Supplemental MaterialW and may be downloaded. (Those interested in source code for the programs may contact the authors directly. There are conversion programs that may be used to convert the Pascal code we used into C, which is a more common programming language.) Second, the WebMark system (9), which uses Filemaker Pro, is an example of a newer software system that has been applied for a similar purpose. We agree with the WebMark authors that there is indeed a significant learning curve when the programs are first being created, but the process speeds up as the programmer(s) acquire experience. Third, the Labsystant software package from Trinity Software appears to have many of the same student feedback features contained in our programs, but we are unsure it can completely duplicate all the features of our system. For faculty who are interested in implementing a dataand calculation-verification system for the laboratory, we recommend several considerations: • Is the program intended to grade the students’ work automatically or is it intended to provide feedback to the students about the accuracy of their calculations? • How will the program affect students’ work habits and time management skills? Do you want the program to be available on the Web outside of laboratory time or only in the laboratory? One of the major goals of our project was to enforce that the students must do the calculations before they leave lab. Hence, it can actually be an advantage that the programs are not available on the Web and are only available in the labs. • Are the data collected in a single database that may be used for pooling data or identifying cheating?

Implementation

To implement a system similar to the one described, there are a number of obstacles to overcome. Even under ideal conditions, a significant effort is required to develop templates for each experiment associated with a course. We offer several options for other faculty who wish to develop similar systems at their own schools. First, the executable files for several of our programs are available in the Supplemental Material and may be downloaded. (Those interested in source code for the programs may contact the authors directly. Conversion programs exist that can translate the Pascal code we used into C, a more common programming language.) Second, the WebMark system (9), which uses FileMaker Pro, is an example of a newer software system that has been applied for a similar purpose. We agree with the WebMark authors that there is a significant learning curve when the programs are first being created, but the process speeds up as the programmers acquire experience. Third, the Labsystant software package from Trinity Software appears to have many of the same student-feedback features contained in our programs, but we are unsure whether it can completely duplicate all the features of our system.

For faculty who are interested in implementing a data- and calculation-verification system for the laboratory, we recommend several considerations:

• Is the program intended to grade the students' work automatically, or is it intended to provide feedback to the students about the accuracy of their calculations?

• How will the program affect students' work habits and time-management skills? Do you want the program to be available on the Web outside of laboratory time or only in the laboratory? One of the major goals of our project was to ensure that the students do the calculations before they leave lab, so it can actually be an advantage that the programs are not available on the Web and can be used only in the labs.

• Are the data collected in a single database that may be used for pooling data or identifying cheating?


Summary

The results of our assessment were that the computer data-verification programs are a definite success. Carrying out calculations at the last minute is no longer an option, since the students must execute and complete their calculations before leaving the lab. This change corresponded to a dramatic improvement in the discussion sections of the lab reports, since the students had correctly calculated results to discuss. The computer programs had the additional benefit of consistently reinforcing the need for accuracy and significant figures in calculations; after the initial learning curve, these skills remained steady throughout the semester in spite of increasingly complex calculations. Standardization and grading became substantially easier as a result of the implementation of the programs, and finally, because cheating was easier to detect and document, it was handled with far greater facility than in the past.

Acknowledgments

The authors would like to acknowledge Edward Gray for his contributions to the project of writing the computer data-checking programs and Jean Roberts for her data management and for her additional insight into student use of the programs.

Supplemental Material

The executable files for several of our programs are available in this issue of JCE Online.

Literature Cited

1. Tilstra, L. J. Chem. Educ. 2001, 78, 762.
2. Olmstead, J. I. J. Chem. Educ. 1984, 61, 798.
3. Varnes, A. W.; Wetmore, D. E. J. Chem. Educ. 1975, 52, 801.
4. Bailey, R. A.; Geisler, C. J. Chem. Educ. 1991, 68, 150.
5. Rosenthal, L. C. J. Chem. Educ. 1987, 64, 996 and references therein.
6. Goodman, W. D.; Bean, J. C. J. Chem. Educ. 1983, 60, 483.
7. Werner, T. C. J. Chem. Educ. 1986, 63, 140.
8. McKaig, N. J. J. Chem. Educ. 1963, 40, 86.
9. Olivier, G. W. J.; Herson, K.; Sosabowski, M. H. J. Chem. Educ. 2001, 78, 1699.
10. Aikens, D. A.; Bailey, R. A.; Strong, R. L. J. Chem. Educ. 1988, 65, 343.
11. Myers, R. L. J. Chem. Educ. 1986, 63, 507.
12. Deutsch, J. L.; Zaleznak, H. N. J. Chem. Educ. 1976, 53, 308.
13. Johnson, R. C. J. Chem. Educ. 1973, 50, 223.
14. Wellman, K. M. J. Chem. Educ. 1970, 47, 142.
15. Pacer, R. A. J. Chem. Educ. 2000, 77, 1435.
16. Kirksey, H. G. J. Chem. Educ. 1992, 69, 497.
17. Abel, K. B.; Hemmerlin, W. M. J. Chem. Educ. 1990, 67, 213.
18. Guymon, E. P.; James, H. J.; Seager, S. L. J. Chem. Educ. 1986, 63, 786.
19. Anderlik, B. J. Chem. Educ. 1980, 57, 591.

Journal of Chemical Education, Vol. 83, No. 4, April 2006, 668-671.