Teaching Academic Honesty - CS50 Podcast, Ep. 10

  • [MUSIC PLAYING]

  • DAVID MALAN: This is CS50.

  • [MUSIC PLAYING]

  • DAVID MALAN: Hello, world.

  • This is the CS50 podcast.

  • My name is David Malan.

  • BRIAN YU: And my name is Brian Yu.

  • And today, we thought we'd discuss academic honesty in CS50.

  • And so every year in CS50, we always have some number of cases

  • of academic dishonesty where some number of students

  • submit work that isn't their own, either by copying homework from a friend

  • or by looking something up online and using a solution they

  • find online as part of their solution.

  • And so this is something that CS50 has had to deal with for years

  • now in terms of how best to address this type of situation,

  • and how best to prevent academic dishonesty in general.

  • DAVID MALAN: Indeed this was-- when I first took over the course

  • myself back in 2007, it was really an end of semester process.

  • After the teaching fellows would evaluate students' work

  • and provide feedback throughout the semester,

  • I would finally, all too often by semester end,

  • carve out some time in order to then cross compare all of the submissions

  • from that semester looking for statistically unlikely similarities

  • between students' work.

  • Indeed, what a student might sometimes unfortunately do

  • is copy the work of another student, lean too heavily

  • on some resource online, copying more than a reasonable number

  • of lines of code.

  • And so by cross-comparing all submissions with software

  • itself, we then notice which lines of code

  • are in both student A's and student B's work, and then conclude ultimately

  • that, statistically, this was unlikely to have happened by chance.
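
As a toy illustration of the line-level cross-comparison David describes (a hypothetical sketch, not the course's actual tooling), one could count how many source lines two submissions share verbatim; statistically unlikely overlap is then what warrants a closer human look:

    // A toy line-level comparison (not CS50's actual tooling): count how many
    // source lines appear verbatim in both of two submissions.
    #include <stdio.h>
    #include <string.h>

    int main(void)
    {
        // two hypothetical submissions, represented as arrays of source lines
        const char *a[] = {"int main(void)", "{", "    printf(\"hello\\n\");", "}"};
        const char *b[] = {"int main(void)", "{", "    printf(\"goodbye\\n\");", "}"};
        int na = sizeof(a) / sizeof(a[0]);
        int nb = sizeof(b) / sizeof(b[0]);

        int shared = 0;
        for (int i = 0; i < na; i++)
        {
            for (int j = 0; j < nb; j++)
            {
                if (strcmp(a[i], b[j]) == 0)
                {
                    shared++;
                    break;  // count each line of submission a at most once
                }
            }
        }

        printf("%d of %d lines shared\n", shared, na);  // prints "3 of 4 lines shared"
        return 0;
    }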

  • BRIAN YU: Now, how exactly do you draw those conclusions?

  • Because I'm thinking about a programming language like C,

  • there are only so many parts of the language.

  • There are for loops and there are conditions.

  • And everyone's solutions to similar problems

  • probably have these sorts of elements.

  • So what exactly do you look for in this process?

  • DAVID MALAN: Yeah, it's quite fair.

  • If we relied on this kind of cross comparison

  • for programs like Hello, World, everyone would appear

  • to have written exactly the same code.

  • But as soon as we get into CS50's second and third weeks

  • where the programs they write in C tend to get a little longer,

  • there does end up being more opportunity for creativity,

  • for different stylistic choices by students.

  • And so students' code does start to drift.

  • Even though at the end of the day the solutions

  • might still be using for loops and while loops and conditions and so forth,

  • students might format their code slightly differently.

  • They might write slightly different comments.

  • And so what tends to happen over time, as the programs exceed

  • maybe 10, 20, 30 lines of code, is that there is enough variation.

  • And indeed, unfortunately, what we often notice

  • is not even necessarily that the code is identical, because as you know,

  • that in and of itself might just be a coincidence.

  • Especially, when nowadays we have 800 students,

  • it is absolutely going to be the case that two students write,

  • by chance, very similar code.

  • But unfortunately, the kinds of things we tend to notice

  • are when students have the same typographical errors,

  • or they use precisely the same variable names,

  • or they make precisely the same mistake in precisely the same location.

  • And at that point, our instincts start to kick in

  • and we look at code like this and start to realize,

  • while this may have happened by chance, at scale

  • the odds that it happened in this line and in this line

  • and in this line between two students' code are

  • just more likely than not better explained by some deliberate act.

  • BRIAN YU: So at Harvard at least, when there are cases of academic dishonesty,

  • they're usually referred to some administrative body, which

  • now is called the Honor Council here at Harvard.

  • And I think you've pointed out, and a couple of other people

  • have pointed out, that CS50, though it is the largest course at the university,

  • does refer far more people to the Honor Council than any other class on campus.

  • Do you think that has to do with something about computer science

  • or introduction to computer science?

  • Or why do you think that might be?

  • DAVID MALAN: No, I don't.

  • And that's certainly an unfortunate distinction that we've long had,

  • save for one or two years where there were issues in other departments.

  • No, I don't think that computer science students are any less honest

  • than their classmates in other fields.

  • I don't think students in CS50 are any less honest than students

  • in other computer science courses.

  • I think it really boils down to one, you and I and educators in computer science

  • are perhaps somewhat uniquely positioned with tools--

  • with software tools via which to detect it.

  • And in a large introductory course like CS50,

  • I think it's important not only out of fairness to those students

  • who are behaving honestly throughout the term, but also because one of our goals

  • should be in this course, to teach students

  • the ethical application of computer science.

  • That we should be holding students to those same expectations as

  • are prescribed in great detail in the course's syllabus.

  • And so I think it's really a function of our, one, looking for it,

  • and, two, following through on it, that really ends up explaining the large numbers.

  • BRIAN YU: Yeah, so I'm looking here at the data from past years in CS50,

  • and it does seem that there's also a fair amount of fluctuation

  • in terms of what percentage of students in the course end

  • up being referred to the Honor Council.

  • Like, in 2009 for example, it looks like nobody

  • was referred to the Honor Council.

  • And in other years like 2010, 2012, there's like 1% or 2% of students.

  • But in other years like 2015, it's up to 5%, 2016 is up to 10%.

  • What do you think accounts for that fluctuation

  • because that's a pretty big difference between one year and another?

  • DAVID MALAN: Yeah, there really has been as you say, from 0% to 10%

  • depending on the year.

  • I think it's a few things.

  • Part of it I think is just a function of how much time

  • I or we put into the process.

  • I think the year in 2009 when there were 0%, I did look for worrisome instances

  • in that particular year, but admittedly in retrospect, I probably

  • spent less time that year than the subsequent year.

  • Because the subsequent year it went up to 2%.

  • With that said, it might have been by chance,

  • just a group of students who exhibited this pattern of behavior

  • with far less frequency than others.

  • So I think that's certainly possible as well.

  • But I think the uptick in more recent years for instance, 10% in 2016

  • and roughly 4% or 5% since then, which is where

  • we've been rather in equilibrium the past few years,

  • I think is also a function of just how much time we invest in it.

  • So back in 2008, and for a few years thereafter,

  • it was only me who was engaged in this process.

  • I would run the software by myself.

  • I would look at students' submissions side by side.

  • And I would ultimately decide which to refer forward

  • to Harvard's Honor Council.

  • And then ultimately, document all those cases.

  • But in more recent years we have involved more of CS50's senior staff

  • in the process.

  • The upside of which is that we can now, one, analyze the submissions roughly

  • on a week-to-week basis.

  • The upside of which is that we can provide the Honor Council

  • with the details far more quickly.

  • Students themselves, while it is never a pleasant

  • process, at least know sooner rather than later, rather

  • than getting to the entire end of the semester

  • and then realizing just how many times or how often they crossed some line.

  • But two, the fact that we have multiple human eyes on it

  • means that we do allocate more time week to week

  • on each of the individual submissions and the cross-comparisons thereof.

  • The upside, though, of those multiple humans--

  • we now have two or three of us who ultimately vote on whether or not

  • a case should move forward to the Honor Council-- is that I at least,

  • and hopefully all of us, have much more comfort in sending a case to the Honor

  • Council because not one pair of eyes, but two or three

  • have all adjudicated it to be a clear indication of a line

  • having been crossed.

  • BRIAN YU: Can you tell me a little more about that process?

  • You've talked about how there are now

  • a couple of pairs of eyes that are all looking at the submissions,

  • but you've also talked about software being involved too.

  • So what is the interplay there between the role that software

  • plays in trying to detect this sort of thing and the role

  • that people play in trying to detect academic dishonesty?

  • DAVID MALAN: Yeah, I should first emphasize

  • that it is not software that is ultimately

  • disciplining students or referring them to Harvard's Honor Council.

  • It is rather just a tool that we use as a first pass.

  • Given that we have, nowadays, some 800 students, each of whom

  • is submitting 10 homework problems over the course of the semester,

  • this is a big O of n squared problem, times 10 or so.

  • So it's a huge number of comparisons that need to be made,

  • and it just couldn't practically be done by hand or by eye alone.
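
To put rough numbers on that quadratic growth, here is a minimal back-of-the-envelope sketch; the figures of 800 students and roughly 10 problem sets come from the conversation, and everything else is illustrative rather than CS50's actual tooling:

    // A back-of-the-envelope sketch of the scale involved: with n submissions
    // per problem set, a full pairwise cross-comparison is n * (n - 1) / 2
    // comparisons, repeated for each of roughly 10 problem sets.
    #include <stdio.h>

    int main(void)
    {
        long n = 800;                   // roughly 800 students, per the conversation
        long problem_sets = 10;         // roughly 10 submissions each per term
        long pairs_per_pset = n * (n - 1) / 2;
        long total = pairs_per_pset * problem_sets;

        // prints "319600 pairs per problem set, 3196000 comparisons in total"
        printf("%ld pairs per problem set, %ld comparisons in total\n",
               pairs_per_pset, total);
        return 0;
    }

Even at millions of pairs, that is quick work for a computer but clearly impractical by hand or by eye.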

  • So what we do is run software that literally cross compares

  • every submission against every other submission

  • sometimes within the current year or even, based on our archives,

  • against recent prior years as well, which explodes the problem even more.

  • And what we get out of that software-based process

  • is a list, from top to bottom, of pairs of submissions that the software considers

  • worrisomely similar.

  • And then we, the humans, typically go through the top 50 or the top 100

  • matches on that list and use our human eyes and our own experience

  • and our instincts to decide, ah, this just happened by chance

  • or, oh, as you said, this is a relatively short program like Hello,

  • World or Mario.

  • This is just bound to happen at that point in the semester.

  • But certainly as the problems get more sophisticated

  • and the code gets longer, it becomes more clear to multiple humans that, hmm,

  • looks like something's awry here, especially when it is again,

  • the same variable names or the same comments or worse,

  • the same comments with typographical or grammatical errors

  • in exactly the same place, odds are that's much more

  • likely to indicate copy paste than it is two students independently

  • in their own rooms, on their own laptops literally writing

  • in the same place the same errors.

  • BRIAN YU: Makes sense.

  • And it's also interesting that, depending on the type of software

  • that you use, in the same way that a compiler can take a C program

  • and figure out what the structure of the program is

  • and compare the structure of one program to another,

  • these sorts of comparison programs can do the same thing.

  • They can take two pieces of code, and even

  • if they might use slightly different variable names,

  • can still look at the structure of the program as a whole

  • and try and compare them against each other

  • to do some more sophisticated comparisons.
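
Brian's point about structure-level comparison can be sketched as follows. This is not Compare50's actual algorithm, merely a hypothetical illustration of one ingredient: normalizing away identifier names so that renamed variables do not disguise identical structure.

    /* A minimal sketch (not Compare50's actual algorithm) of "normalizing" C
     * source before comparison: every identifier that is not a known keyword
     * is replaced with a placeholder, so that merely renaming variables does
     * not hide structural similarity between two submissions. */
    #include <ctype.h>
    #include <stdio.h>
    #include <string.h>

    // a deliberately tiny keyword list, just for illustration
    static const char *keywords[] = {"int", "for", "while", "if", "else", "return"};

    static int is_keyword(const char *word)
    {
        for (size_t i = 0; i < sizeof(keywords) / sizeof(keywords[0]); i++)
            if (strcmp(word, keywords[i]) == 0)
                return 1;
        return 0;
    }

    // print `line` with every non-keyword identifier replaced by "ID"
    static void normalize(const char *line)
    {
        size_t i = 0, n = strlen(line);
        while (i < n)
        {
            if (isalpha((unsigned char) line[i]) || line[i] == '_')
            {
                char word[64] = {0};
                size_t j = 0;
                while (i < n && (isalnum((unsigned char) line[i]) || line[i] == '_') && j < 63)
                    word[j++] = line[i++];
                printf("%s", is_keyword(word) ? word : "ID");
            }
            else
            {
                putchar(line[i++]);
            }
        }
    }

    int main(void)
    {
        // two hypothetical snippets that differ only in variable names
        normalize("for (int row = 0; row < height; row++)\n");
        normalize("for (int i = 0; i < n; i++)\n");
        return 0;   // both lines normalize to: for (int ID = 0; ID < ID; ID++)
    }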

  • DAVID MALAN: Yeah, and thanks to some of CS50's team members, Chad

  • and [? Yella ?] and Kareem, we now have our own tool, Compare50,

  • which automates this process for us.

  • And you can perhaps, given your experience in the space,

  • speak a little more to the algorithmics underneath the hood?

  • BRIAN YU: Yeah, it is really Chad and [? Yella ?]

  • and Kareem that were doing a lot of the work there.

  • But algorithmically, it's sort of an interesting challenge

  • to figure out how to do these sorts of comparisons.

  • Because even though it might seem like a computer

  • is obviously going to be able to do it faster

  • than people are going to be able to do it,

  • it's still a lot of work even for a computer.

  • Especially, if you consider like 800 students in the class being compared

  • against all of the other students, plus all of the students who have ever taken

  • CS50 before, not only for one problem, but for all of the problems

  • in the course.

  • That's a lot of work for any computer to do.

  • And so there are a lot of interesting algorithmic efficiencies

  • that have been put into the software in order

  • to make it work a little bit better.

  • Trying to take advantage of things you actually learn about in CS50.

  • Things like hashing in order to store data inside of a hash table

  • so you can very quickly look up whether or not

  • you've seen a particular pattern of characters in a file before.

  • Those sorts of data structures all come into play if you start to think about,

  • how do you try and solve this problem in a way that's efficient?
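
As a deliberately toy illustration of the hashing idea Brian mentions, one could fingerprint short substrings of one file into a hash table and then test whether substrings of another file have been seen before; the window length and table size below are arbitrary choices, and this is not Compare50's actual implementation.

    /* A toy sketch of substring fingerprinting with a hash table: hash every
     * K-character window of one text, then report how many windows of a
     * second text have been seen before.  Illustrative only; not Compare50. */
    #include <stdbool.h>
    #include <stdio.h>
    #include <string.h>

    #define K 8                // window length (arbitrary for this sketch)
    #define TABLE_SIZE 65536

    static bool seen[TABLE_SIZE];   // a crude hash set of window fingerprints

    // djb2-style hash of a K-character window
    static unsigned hash(const char *s)
    {
        unsigned h = 5381;
        for (int i = 0; i < K; i++)
            h = h * 33 + (unsigned char) s[i];
        return h % TABLE_SIZE;
    }

    int main(void)
    {
        const char *a = "for (int i = 0; i < n; i++) sum += i;";
        const char *b = "int total = 0; for (int i = 0; i < n; i++) total += i;";

        // index every K-character window of the first text
        for (size_t i = 0; i + K <= strlen(a); i++)
            seen[hash(a + i)] = true;

        // count windows of the second text whose fingerprint was already seen
        // (hash collisions could cause occasional false matches; fine for a sketch)
        int matches = 0, windows = 0;
        for (size_t i = 0; i + K <= strlen(b); i++, windows++)
            if (seen[hash(b + i)])
                matches++;

        printf("%d of %d windows matched\n", matches, windows);
        return 0;
    }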

  • DAVID MALAN: Yeah.

  • And besides software, certainly our own policies have evolved over time.

  • So you know, for instance, that in a few weeks' time,

  • we'll be presenting, at a computer science education conference called

  • CSEIT, a recent paper that a few of us worked on

  • based on our experience with issues of academic dishonesty

  • over the past few years.

  • And it's perhaps worth noting that software aside,

  • I think one of the more noteworthy policy changes

  • we introduced some years ago was CS50's so-called Regret Clause,

  • which was just a single sentence that we added to the course's

  • syllabus that encouraged students to come forward

  • if, within 72 hours of submitting some work,

  • they realized that, oh, they had indeed crossed some line.

  • They had copied unduly from some resource online.

  • They had copied some portion of code from a classmate

  • or otherwise somehow crossed the line

  • that was prescribed in the course's syllabus as being not reasonable.

  • And what we committed to doing in writing in the course's

  • syllabus was that there would still be a penalty

  • and there would still be a consequence, but it

  • would be limited, for instance, to our zeroing the problem or the problem

  • set that the student had submitted.

  • And we committed to not escalating the matter to Harvard's Honor Council.

  • The hope was that we could actually turn what had historically

  • been purely punitive processes whereby we detect some transgression,

  • we refer it to the Honor Council, and thereafter

  • the student is penalized in some way, the most extreme outcome of which

  • might actually be required time off from Harvard University itself.

  • We wanted to create a window of opportunity

  • where students after some sleep, some thought, some reflection,

  • can actually own up to a mistake.

  • Because for so many years, so many of our cases

  • were truly involving students who, at 2:00 AM, 3:00 AM, 4:00 AM, are

  • on very little sleep, under a significant amount of stress,

  • and with a deadline not only in CS50, but perhaps some other course looming,

  • made some poor decision to take the quick way out

  • to just copy and paste someone else's work and submit it as their own.

  • And even if they decided or realized a day or two later, wow,

  • I really didn't mean to do that,

  • I really shouldn't have done that, we had never described

  • a well-documented process for how they should handle that

  • and how they could own up.

  • And so this Regret Clause was meant to help ideally chip away

  • at the total number of cases we were seeing.

  • But ultimately, help students meet us halfway

  • so that it becomes more of a teachable moment

  • if you will and not just punitive.

  • BRIAN YU: So I remember when I first took CS50 in fall 2015,

  • I remember seeing the Regret Clause in the syllabus.

  • And I remember being a little surprised.

  • Because it wasn't something I had seen before.

  • It's not something that many other classes do.

  • Not really anything that I was familiar with.

  • So I'm curious about where the policy came from?

  • Was it inspired by any other policy?

  • Or where did you start to find your way to this idea?

  • And what was the process like for bringing this into the course?

  • DAVID MALAN: Yeah, it was really inspired

  • by having, for almost 10 years, watched the number of cases

  • come through CS50 and watching the circumstances that ultimately explain

  • them.

  • Again, these late-night poor decisions under great stress.

  • And it just felt like we, the teachers of the course, should be doing

  • or could be doing a more proactive job at trying to tackle this problem.

  • And not just looking to detect it, but looking to teach students how to, one,

  • ideally avoid it altogether,

  • but, two, even if they do cross some line, how to address the situation then.

  • And yet, it was not with great ease that we rolled this out.

  • There were absolutely some sensitivities on campus among administrators,

  • among the university's Honor Council, who had long-standing processes when

  • it came to issues of academic dishonesty, not only for CS50,

  • but all courses at Harvard.

  • The upside of course, is that by having a central body, Harvard's Honor

  • Council, adjudicate all of these cases, you have uniform processes.

  • You hopefully have more equitable outcomes overall.

  • And there was great concern initially in some circles

  • that we were now doing something more on our own internally.

  • And so it only debuted after quite a few conversations with Harvard's Honor

  • Council and administration so that we could ultimately

  • get folks comfortable with what, at the time, was an experiment,

  • but now is an ongoing six-year policy for us at least.

  • BRIAN YU: All right, so now, six years in, the policy

  • has been around for a little while.

  • Do you feel like it's done what you expected it to do?

  • How does it compare to what your original goals and objectives were

  • for what the policy would do for the class and for students?

  • DAVID MALAN: Yeah, so we hoped that it would actually

  • chip away at the total number of cases that we

  • were referring to Harvard's Honor Council,

  • but it did not, in fact, do that.

  • Interestingly enough, the number of cases

  • we have referred to the Honor Council since has been roughly the same

  • or even higher in some years than prior to the Regret Clause's introduction.

  • We had a wonderful and nontrivial number

  • of students avail themselves of this clause

  • most years. So in the clause's first year,

  • 2014, we had 19 students come forward under this clause,

  • reach out to me, the course's head, generally by way of an email first.

  • After which we would then schedule time to chat with me.

  • And I would chat with these 19 students one on one

  • and better understand what had happened and what they had done.

  • Better understand what circumstances had led

  • to them having made whatever decision it was we were then discussing.

  • And then ultimately, explicitly tell them, all right,

  • let's consider the matter behind us.

  • After zeroing the particular work in question

  • to reassure them that this was indeed the end of that process.

  • But the beginning, hopefully, of a healthier approach to future problem sets.

  • And we would then encourage them to--

  • and discuss with them ways for better managing their time,

  • better managing their stress.

  • In some cases, too, it came to light that there

  • were extenuating circumstances.

  • Students struggling with issues at home, with their family,

  • with relationships, with other courses, issues of mental health.

  • And so what was a pleasant revelation to us

  • was that we were able more proactively than had been possible in the past

  • to connect students with support resources on campus,

  • whether academic in the case of tutoring,

  • or perhaps health in the way of mental health.

  • So that too seemed to be a positive outcome of the experience,

  • that we were able to connect up to 19 students

  • that first year with other resources on campus.

  • And thereafter it fluctuated.

  • In 2015, we had 26 students.

  • In 2016, we had seven students.

  • Then it went back up in 2017 to 18 students.

  • And I think this variation is partly just a function

  • of messaging on our part, on my part.

  • How much time we spend in lectures and in emails

  • during the semester reminding students of the policy's availability.

  • I also suspect that there's some ebb and flow based on the current--

  • the given year.

  • If more students in this class know that a student in the previous year

  • might have invoked this clause there just might be broader awareness of it.

  • But it's been a good number of students, I think every semester.

  • However, the fact that we didn't see a downturn in the number of cases

  • we referred was also a surprise.

  • In fact, in the first year of the Regret Clause's existence,

  • it turned out that most, if not all, of the students

  • that invoked the Regret Clause did not even appear on our radar

  • when we ran our software-based cross-comparisons of their work.

  • Which suggested that had they not come forward,

  • we actually would not have noticed and they would not

  • have been connected ideally with these resources.

  • And so that too was a bit of a surprise.

  • These students invoking the Regret Clause,

  • I dare say, composed a different demographic of students

  • that we hadn't previously identified.

  • Students who had indeed crossed some lines in many cases,

  • but who had not been connected with or been

  • offered some teachable moment that might actually help them course correct.

  • And I should note too, that of the 19 students, 26 students, and so forth,

  • not all of them had indeed crossed some lines.

  • In several cases each year, there were students unnecessarily worried.

  • And so I would simply reassure them and thank them for coming forward:

  • not to worry, you've navigated the waters properly.

  • BRIAN YU: Yeah, it's really interesting that now

  • by reaching this other demographic, you've

  • been able to have these sorts of chats that otherwise may not

  • have been able to happen and connect them with other kinds of resources.

  • I'm curious as to what kinds of advice you

  • give to students that find difficulty with time management and stress?

  • Because I think this is not a problem unique to CS50

  • or other computer science classes, but just in school in general

  • or even outside of school.

  • Like, time management, stress, managing these things and making good decisions

  • is--

  • it's challenging.

  • And something that I'm sure many students and other people face.

  • DAVID MALAN: Yeah, absolutely.

  • To be honest, it's fairly straightforward things.

  • It's things that we even put in the course's syllabus or FAQs often.

  • For instance, in a programming class like ours, start early.

  • You have nearly seven days from start to finish for each programming assignment.

  • And the key to avoiding a lot of the stress

  • is to just start early, so that when you do invariably hit a wall

  • or encounter some bug that you just can't quite see, you can go to sleep,

  • you can go for a run, you can take a shower.

  • You can take a break from it and come back to it

  • some hours or even a couple of days later and have that perspective.

  • I mean even I found in the real world that I do not produce good code when I,

  • myself am under stress.

  • It's no fun.

  • It doesn't yield correct results.

  • And so really helping students realize that, it is a relatively simple fix.

  • They just really need to take charge and commit themselves to that.

  • Besides that, it's often a matter of referring students and reminding

  • them of the many resources that the course offers on campus, whether it's

  • the course's lectures, or sections, or office hours, or notes, or tutorials,

  • or any number of online and in-person resources.

  • And just reminding them that you need to meet the course halfway

  • and take advantage of these resources.

  • And it's no surprise that you are struggling

  • if you're not availing yourself of at least some of these resources.

  • BRIAN YU: Yeah, actually it's always incredible to me

  • when on our problem set forms, we always ask students, like, on what day

  • did you start the problem set?

  • And so many students respond like the day of the deadline or the day

  • before the deadline for a project that we wrote with the expectation

  • that it will take students a week to complete it.

  • And students are trying to do it like day of or day before.

  • It always amazes me the number of cases where that ends up happening.

  • DAVID MALAN: Yeah, so I think the more we can send that message even before we

  • get to the point of a student having a Regret

  • Clause conversation, the better.

  • I should note, though, too, that another surprising effect of the Regret Clause

  • was not even that the number of cases we referred didn't go down,

  • but rather that in at least one year they went significantly up.

  • In 2016, as you noted, is when we had 10% of the course's student body--

  • so this is 10% of the students taking CS50-- referred

  • to the university's Honor Council.

  • But to be honest, that too was in part--

  • and I think our numbers since have been partly-- a reflection of our feeling

  • that when we do detect what appears to be a straightforward

  • case of academic dishonesty, plagiarism of some sort, duplication of code,

  • these days, I think I personally am even more comfortable referring the case

  • than I was in years past because we have given students an opportunity

  • to meet us halfway and reach out.

  • And indeed, as you know, in every one of the course's

  • problem sets this year, on the form via which they submitted their work,

  • we asked them to check a checkbox to acknowledge their understanding

  • of the clause's availability.

  • And so at that point, if we are not only reminding students

  • each week that it's available, but they are not thereafter

  • taking advantage of it, it seems quite reasonable,

  • I think, for the course to move forward with the more

  • traditional punitive process involving the Honor

  • Council to investigate whether indeed the line had been crossed.

  • BRIAN YU: I'm curious.

  • So we often talk now about like the line being crossed

  • and what it means to cross the line.

  • I'm curious about how you see this in the context of programming assignments

  • in particular. Like, if you're writing an essay

  • and you copy a sentence, that seems like very clearly copying.

  • But in the case of code if you copy a line of code

  • you see from Stack Overflow for example, if you're looking up like,

  • how do I solve this particular problem, and you incorporate a line of code,

  • that might not be crossing a line.

  • So how do you think about where the line is in the context of a programming

  • assignment?

  • And how to teach that kind of thing?

  • DAVID MALAN: Yeah, it's a really good question.

  • And it's a common question, because I think

  • there's a perception among folks both in the software

  • world and non-software world that this notion of academic dishonesty

  • in a programming class itself is incompatible with the idea

  • of programming.

  • And I do very much disagree with that.

  • The lines that we prescribe to students, both in broad strokes

  • and in very precise bullets in the course's syllabus, essentially

  • try to teach students to be reasonable, so to speak.

  • And what might that mean?

  • Well, early in the semester in CS50, we of course

  • have students, in C and later in Python, implement Mario's pyramid.

  • So a sort of pyramid-like structure just using some ASCII art to paint that

  • picture.

  • And it involves ultimately like a couple of for loops.

  • It would be unreasonable for students to go off and Google or look

  • on Stack Overflow for something like, how print Mario's Pyramid.

  • That would be a search for the outright solution to the problem.

  • And surely it is not our intent to assess you

  • on your ability to Google a solution like that as opposed

  • to crafting it yourself.

  • However, it would be very reasonable, for instance, to Google something

  • like, how to write nested for loops in C, or how to print spaces in C.

  • Because it's actually not obvious to students, one, how you can actually

  • have two loops, one nested inside of the other,

  • using different counting variables.

  • And two, how to print what would appear to be blank spaces on the screen,

  • not quite appreciating that it's actually just the SPACEBAR.
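
To make the two snippet-level techniques David mentions concrete (nested loops with separate counters, and printing leading spaces), here is a minimal sketch; the height of 5 is an arbitrary choice, and this is offered as an illustration rather than as CS50's official solution.

    // A minimal sketch of the two techniques mentioned above: nesting one loop
    // inside another with its own counter, and printing leading spaces so that
    // a block of '#' characters ends up right-aligned.
    #include <stdio.h>

    int main(void)
    {
        int height = 5;   // arbitrary height for illustration
        for (int row = 1; row <= height; row++)
        {
            // leading spaces: the spacebar character, printed height - row times
            for (int i = 0; i < height - row; i++)
                printf(" ");

            // then a run of hashes, one more per row
            for (int j = 0; j < row; j++)
                printf("#");

            printf("\n");
        }
        return 0;
    }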

  • So I think it's very reasonable for students

  • and it is allowed in the course syllabus to look for short snippets

  • so to speak, of code,

  • where a snippet itself is one line, a few lines,

  • but it is not the essence of the problem.

  • And so indeed when we do find that students

  • have crossed the line, what has happened is

  • we notice some curiosity about their code.

  • It's maybe very similar to another student's code

  • or it suggests a technique that we haven't

  • taught in the class or some syntax that's not consistent with what we

  • know students have seen in the class.

  • And so we ourselves might Google certain key phrases or portions of code

  • or comments that we see in their code.

  • And sure enough, it too often leads us to the very same GitHub repository

  • or Reddit post, wherein someone else has posted exactly that same code

  • that the student has copy pasted.

  • And so there too, the kinds of cases we are referring

  • are not the many, many, many students

  • who very reasonably use these kinds of digital resources,

  • but the ones who use these resources and then take

  • shortcuts to submission as by just copying and pasting many lines of code

  • that they see.

  • BRIAN YU: So other than the Regret Clause now,

  • which we've talked about for a little while,

  • have there been any other things you've thought about doing

  • or things you have done to the course in terms

  • of thinking about how to either address academic dishonesty when it happens

  • or to try to prevent it beforehand?

  • DAVID MALAN: We have.

  • So a couple of years ago, we introduced the course's Brink Clause, so to speak,

  • which was a couple of sentences inspired by a colleague of ours at Princeton,

  • Chris Moretti, who gave us some really inspiring language that

  • encouraged students in the course's syllabus to write us late at night

  • just as they felt themselves being on the brink of making a poor decision.

  • That is to say, even when you and I and most of the course's staff

  • might be asleep and a student might be working late at night on their work,

  • it would not be reasonable to assume that they

  • could get a response to a request for an extension, for instance.

  • And so what this Brink Clause prescribed was

  • a mechanism for students to send that note to say, listen,

  • I really feel like I'm in a bad place.

  • And I worry I'm about to make a poor decision as by copying and pasting

  • too many lines of code online.

  • I'd like to discuss this tomorrow. And indeed, that's

  • what the syllabus asked them to do.

  • Go to sleep, don't submit your work.

  • We'll figure it out in the morning.

  • And just inviting students to write us and meet us halfway

  • under that sort of duress was the intent of the clause.

  • Unfortunately, when it was invoked some number of times

  • that first year, based on the wording of the emails,

  • based on the conversations we had with students,

  • it really devolved into a backdoor to just extensions.

  • We did not believe, ironically, that most

  • of the students who were invoking this clause

  • were actually on the brink of doing something academically dishonest.

  • They were simply on the brink of not meeting the deadline.

  • And so we ended up removing the clause from the course's syllabus, ultimately.

  • I'm glad we did try it, but this was one example of a measure that, at least

  • for us, in our context, in our implementation, failed.

  • But I do think more compelling has been what we introduced a few years ago

  • in the spirit of the Regret Clause, whereby we actually

  • initiate the conversations.

  • So it's not infrequently been the case that when

  • we've cross-compared so many students' submissions, there are

  • a few cases that seem a little worrisome,

  • but it definitely doesn't seem like they're over the line.

  • We certainly wouldn't refer them to the Honor Council on that basis.

  • But we realized that this then is an opportunity for us to maybe

  • go chat with those students now and say, hey, listen, you appeared on our radar.

  • We think it's because of the similarities between your code

  • and maybe some other student's.

  • And we would leave the other student anonymously out of it.

  • But we would then ask the student, how did you get your code to this point?

  • Walk us through the process and let's figure out

  • how you came so close to what we worried was crossing a line,

  • so that you can just avoid it moving forward.

  • And so these interventional conversations,

  • as we describe them internally, have, I hope, actually

  • gone a long way to just helping students navigate the waters.

  • Even if they don't cross those lines, they at least now

  • are being more conscious and thoughtful about what it is they're doing.

  • BRIAN YU: And what do you usually gather from those sorts

  • of interventional chats?

  • Like, what sorts of actions do you find that students are taking?

  • Does it seem like there's some teachable moment there

  • that you're helping students with?

  • DAVID MALAN: I think so because not infrequently would it

  • be the case that two students were indeed working reasonably

  • on the homework assignment together.

  • But they were perhaps asking each other a few too many questions about code.

  • It wasn't necessarily entirely in pseudocode or in English,

  • their conversations.

  • And maybe one was being shown the other's code,

  • which is allowed within some circumstances per the syllabus.

  • But maybe a little too frequently.

  • And so as such, their work was just sort of, over time,

  • converging to become one and the same.

  • And so given that we would have these chats within a week of them having done

  • that, it was usually pretty obvious to students like, oh, let's

  • not do that again.

  • And recalibrate their approach.

  • BRIAN YU: So it seems like all in all, CS50

  • has tried a lot with the Regret Clause, with the Brink Clause,

  • with these interventional chats that you've had with students.

  • A lot that CS50 has done with regards to the issue of academic dishonesty

  • and trying to create teachable moments out of that.

  • And trying to work within the university and with students

  • on how to improve that situation.

  • What do you think are the lessons to be taken away for other courses?

  • What can other classes do, either in computer science

  • or outside of computer science, based on the lessons

  • that you and the course overall have learned

  • from these years of working with these issues of academic dishonesty?

  • DAVID MALAN: I think one takeaway has been just clarity.

  • Our policy in the course's syllabus is not short, but it is detailed.

  • And that's the result of a lot of situations

  • having arisen over the years, a lot of conversations

  • having happened over the years.

  • And so I am glad that we do document so clearly for students,

  • where the lines are and what our expectations of students are.

  • Toward that end too, I think it has been a good thing that we've introduced

  • these interventional conversations.

  • Even if a course is not as involved in the mechanics of the process as we are,

  • they're not necessarily running software to cross-compare submissions.

  • But when something does appear on the radar,

  • if a teaching fellow or teaching assistant does

  • notice some curiosity in the student's code, it's dissimilar to their code

  • last week or it's a little too similar to another student's, I

  • think just being comfortable reaching out proactively to those students,

  • not to impugn them, but rather to say, listen, we have some concerns.

  • We don't feel you've crossed a line, but we'd

  • like to better understand what you've done and how you did this.

  • So that we can steer you in the right direction moving forward.

  • That too seems a very straightforward, healthy and teachable opportunity.

  • And as for the Regret Clause, I certainly

  • think it's worth trying in other classes.

  • I think it certainly is completely reasonable

  • that a course, whether ours or anyone else's,

  • just clearly defines what steps students should take when they find themselves

  • in certain situations.

  • And prior to the Regret Clause, it was ill-defined.

  • What should a student do if they make a poor decision, especially late at night

  • and then they do actually regret it the next day or some number of hours later?

  • There was no well-defined process.

  • And while technically, there was nothing stopping a student

  • from coming forward and turning themselves in,

  • I can certainly appreciate the trepidation

  • that a student might have with taking that on not knowing

  • what the outcome might be.

  • Especially, if they assume it might even be time off from the University itself.

  • So I think the fact that we've sort of clarified

  • how to conduct oneself before you get to that point,

  • after you get to that point, and after we have detected as much,

  • is just only fair to students in the class.

  • BRIAN YU: I think there are a lot of very useful lessons

  • there in terms of what classes can start to do about this sort of issue.

  • Certainly, if any of you are interested in learning more about this,

  • we've actually written a paper, the two of us

  • along with Doug Lloyd on CS50's team, about academic honesty in CS50.

  • So we can provide a link to that if you're

  • interested in reading more about the policy and about the Regret Clause

  • and about other interventions that CS50 has made on these sorts of issues.

  • DAVID MALAN: Indeed.

  • The title is Teaching Academic Honesty In CS50.

  • If you want to Google something like that.

  • And if you're more interested in the software side of things and the cross-comparison

  • of submissions, if you go to github.com/cs50/compare50, you'll be

  • able to play around with the open source software there as well.

  • BRIAN YU: Certainly, if you have any feedback about today's podcast

  • or suggestions for future podcast ideas, you can always

  • reach us at cs50.harvard.edu.

  • DAVID MALAN: This was CS50.

[MUSIC PLAYING]
