How to Get Faculty to Learn from Mistakes: Reflecting on Questioning Techniques for Improvement

Years ago, as a graduate student, I was navigating the often-murky waters of academic research, specifically trying to understand questionnaire validation. Seeking guidance, I approached faculty members within my university’s psychology department, repeatedly asking, “Can you tell me how to validate the questions in my survey?” The responses were consistently polite yet unhelpful, often directing me to another professor who, ultimately, couldn’t provide the answers either. It was baffling that experts in psychology, a field heavily reliant on questionnaires, seemed unable to guide a student on such a fundamental aspect of research methodology.

This experience sparked a determination within me. After graduation, I embarked on a quest to demystify questionnaire validation, delving into library books and online resources. While I found information on question design and response scales, a comprehensive, start-to-finish guide on validation remained elusive. Undeterred, I pieced together information from various sources, developing my own systematic approach. This methodology has not only supported my publications based on questionnaire research but also earned me a grant to validate a questionnaire assessing clinicians’ perceptions of electronic decision-making tools. It seems journal editors and reviewers consider me knowledgeable in this area, perhaps because, much like my former professors, they too find questionnaire validation a somewhat ambiguous process. Ironically, I became an “expert” in a domain where expert guidance was initially so hard to find. This journey underscores a broader point: learning often arises from necessity and persistence, especially when established expertise falls short. This mirrors the challenges faculty face in refining their teaching practices, particularly in areas like questioning techniques, where learning from mistakes is crucial for growth.

This brings us to the core issue: how do we get faculty to learn from their questioning mistakes? Just as validating a questionnaire requires a structured approach, so does improving faculty questioning techniques through error analysis and reflection. The process, adapted from questionnaire validation principles, can be outlined as follows:

  1. Establish ‘Face Validity’ through Expert Review and Self-Reflection. In questionnaire validation, face validity involves expert review to ensure questions effectively capture the intended topic. For faculty, this translates to a dual approach. Firstly, engage in self-reflection on your questioning practices. Record your lectures or teaching sessions and critically analyze the types of questions you ask. Do they align with your learning objectives? Are they clear, concise, and stimulating? Secondly, seek peer review. Invite experienced colleagues or educational specialists to observe your teaching and provide feedback specifically on your questioning techniques. They can offer insights into whether your questions are effectively probing student understanding and promoting critical thinking. This external perspective is analogous to having a psychometrician review a questionnaire for common errors. Just as a questionnaire expert identifies flawed questions, a peer reviewer can pinpoint areas where your questioning strategies may be falling short.

  2. ‘Pilot Test’ Questions in a Low-Stakes Environment. Pilot testing a questionnaire involves trying it out on a small subset of the target population before full deployment. For faculty, this means experimenting with different questioning techniques in a low-stakes setting. This could involve trying new question types in a smaller class, a workshop, or even during office hours. Observe student responses and engagement. Are students confused by certain questions? Do some questions elicit more thoughtful responses than others? This pilot phase allows you to identify potentially problematic questions or questioning styles before they impact larger student groups or formal assessments. The goal is to gather preliminary data on the effectiveness of your questions, just as pilot testing reveals issues with survey questions before large-scale data collection.

  3. ‘Clean the Data’: Analyze Student Responses and Identify Areas for Improvement. In questionnaire validation, data cleaning involves checking for errors and inconsistencies in collected responses. For faculty learning from questioning mistakes, ‘data cleaning’ translates to a thorough analysis of student responses to your questions, alongside broader feedback on your teaching. This could involve reviewing student answers in class discussions, analyzing responses to formative quizzes, or examining feedback from student evaluations. Look for patterns. Are there specific types of questions that consistently lead to misunderstandings or incorrect answers? Are there topics where students struggle to engage with your questions? This analysis helps pinpoint areas where your questioning techniques are not effectively facilitating student learning. Just as data cleaning in surveys identifies flawed data points, analyzing student responses reveals weaknesses in your questioning approach (a minimal tallying sketch appears after this list).

  4. ‘Principal Component Analysis (PCA)’: Identify Underlying Themes in Questioning Mistakes. PCA in questionnaire validation helps identify underlying factors or themes measured by a set of questions. While you cannot literally run PCA on questioning mistakes, the underlying principle is valuable. Faculty can, through reflection and analysis, identify recurring themes in their questioning errors. Do you tend to ask leading questions? Are your questions often too vague or ambiguous? Do you primarily focus on recall rather than higher-order thinking skills? Identifying these recurring themes is akin to PCA revealing underlying factors in survey data. It helps faculty understand the core areas where their questioning techniques need refinement (a brief PCA sketch on hypothetical survey data appears after this list).

  5. ‘Cronbach’s Alpha’: Check for Internal Consistency in Questioning Approach. Cronbach’s Alpha measures the internal consistency of items within a questionnaire. For faculty, ‘internal consistency’ can be interpreted as the consistency of your questioning approach across different topics and teaching sessions. Are you consistently employing a variety of question types? Is your questioning style aligned with your overall teaching philosophy and learning objectives? Inconsistency in questioning can be confusing for students and less effective in promoting learning. Reflecting on the consistency of your approach, and seeking feedback from peers or mentors, can help ensure a more coherent and effective questioning practice (a short sketch of computing Cronbach’s Alpha itself appears after this list).

  6. ‘Revise’: Refine Questioning Techniques Based on Analysis. The final step in questionnaire validation is revising the survey based on the findings from PCA and Cronbach’s Alpha. Similarly, the crucial final step for faculty is to revise their questioning techniques based on the insights gained from the preceding steps. This might involve consciously incorporating more open-ended questions, practicing wait time after asking questions, or focusing on questions that promote deeper analysis and application of knowledge. Revision is not a one-time event but an iterative process. Just as questionnaire validation may involve multiple rounds of pilot testing and revision, improving faculty questioning is an ongoing cycle of reflection, experimentation, and refinement.
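
To make step 3 concrete, here is a minimal sketch of the kind of tallying that can surface patterns in student responses. The data, question-type labels, and column names are all hypothetical, and the same analysis could be done in a spreadsheet; the point is simply to compute an error rate per question type and see which types are misfiring.

```python
import pandas as pd

# Hypothetical record of in-class questions and whether students answered correctly.
responses = pd.DataFrame({
    "question_type": ["recall", "recall", "application", "analysis",
                      "application", "analysis", "recall", "analysis"],
    "correct":       [True, True, False, False, True, False, True, False],
})

# Error rate per question type: high rates flag where questioning may be falling short.
error_rates = (
    responses.assign(incorrect=lambda df: ~df["correct"])
             .groupby("question_type")["incorrect"]
             .mean()
             .sort_values(ascending=False)
)
print(error_rates)
```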
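
For step 4, PCA itself belongs to the questionnaire-validation side of the analogy, so the sketch below shows it in that setting: simulated Likert-style responses generated from two invented latent themes, with PCA recovering roughly two dominant components. The data, factor names, and item structure are assumptions made purely for illustration.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n = 200

# Two hypothetical latent themes driving the survey items.
clarity = rng.normal(size=n)   # e.g., "question clarity"
depth = rng.normal(size=n)     # e.g., "cognitive depth"

# Four items: the first two load on clarity, the last two on depth, plus noise.
items = np.column_stack([
    clarity + rng.normal(scale=0.4, size=n),
    clarity + rng.normal(scale=0.4, size=n),
    depth + rng.normal(scale=0.4, size=n),
    depth + rng.normal(scale=0.4, size=n),
])

# Standardize, then ask PCA how much variance each component explains.
pca = PCA(n_components=4)
pca.fit(StandardScaler().fit_transform(items))
print("Explained variance ratios:", pca.explained_variance_ratio_.round(2))
# Two dominant components suggest two underlying themes in the items.
```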
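
For step 5, the statistic behind the analogy is straightforward to compute. The sketch below implements the standard formula, alpha = (k / (k - 1)) * (1 - sum of item variances / variance of total scores), on a small, made-up matrix of Likert responses.

```python
import numpy as np

def cronbachs_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a respondents-by-items score matrix."""
    k = scores.shape[1]                          # number of items (questions)
    item_vars = scores.var(axis=0, ddof=1)       # sample variance of each item
    total_var = scores.sum(axis=1).var(ddof=1)   # variance of respondents' total scores
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical 5-point Likert responses: 6 respondents x 4 items.
responses = np.array([
    [4, 5, 4, 4],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 3, 2, 3],
    [4, 4, 5, 4],
    [3, 2, 3, 3],
])
print(f"Cronbach's alpha: {cronbachs_alpha(responses):.2f}")
```

Values closer to 1 indicate that the items hang together; for the teaching analogy, the takeaway is only the idea of checking whether the parts of your questioning approach are pulling in the same direction.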

Learning from mistakes is fundamental to growth in any profession, and teaching is no exception. By adopting a structured, reflective approach to analyzing and refining questioning techniques, faculty can continuously improve their teaching effectiveness and create a more engaging and impactful learning environment for their students. Just as a validated questionnaire yields reliable and meaningful data, refined questioning techniques lead to deeper student understanding and enhanced learning outcomes. The journey from recognizing mistakes to mastering effective questioning is a continuous process of professional development, one that ultimately benefits both faculty and students alike.
